PoS - Proceedings of Science
Volume 444 - 38th International Cosmic Ray Conference (ICRC2023) - Gamma-ray Astronomy (GA)
Generating airshower images for the VERITAS telescopes with conditional Generative Adversarial Network
K.D. Hoang* and D.A. Williams
Pre-published on: July 25, 2023
VERITAS (Very Energetic Radiation Imaging Telescope Array System) is a current-generation array of four 12-meter ground-based optical Imaging Atmospheric Cherenkov Telescopes (IACTs). Its primary goal is to indirectly observe gamma-ray emission from the most violent astrophysical sources in the universe. Recent advances in Machine Learning (ML) have sparked interest in using neural networks (NNs) to infer properties directly from IACT images. However, the training data for these NNs are currently generated through computationally expensive Monte Carlo (MC) simulations. This study presents a simulation method that employs conditional Generative Adversarial Networks (cGANs) to synthesize additional VERITAS data for training future NNs. In this proof-of-concept study, we condition the GANs on five classes of simulated camera images: circular muon rings and gamma-ray shower images in the first, second, third, and fourth quadrants of the camera. Our results demonstrate that, by casting the training data as time series, cGANs can 1) replicate shower morphologies based on the input class vectors and 2) generalize to additional signals through interpolation in both the class and latent spaces. Leveraging the strength of GPUs, our method can synthesize novel signals at an impressive speed, generating over $10^6$ shower events in less than a minute.
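The class-conditioning and interpolation idea described in the abstract can be illustrated with a minimal NumPy sketch: a one-hot class vector is concatenated with a Gaussian latent vector to form the generator's input, and new conditioning vectors are obtained by linear interpolation. The dimensions, class ordering, and function names below are illustrative assumptions, not the architecture used in the paper.

```python
import numpy as np

NUM_CLASSES = 5   # muon ring + gamma showers in quadrants 1-4 (per the abstract)
LATENT_DIM = 64   # illustrative choice; the paper does not specify this value

def make_generator_input(class_idx, rng, latent_dim=LATENT_DIM,
                         num_classes=NUM_CLASSES):
    """Concatenate a one-hot class vector with a Gaussian latent vector."""
    one_hot = np.zeros(num_classes)
    one_hot[class_idx] = 1.0
    z = rng.standard_normal(latent_dim)
    return np.concatenate([one_hot, z])

def interpolate(a, b, alpha):
    """Linearly interpolate between two generator input vectors.

    Applied to the class part this blends conditioning labels; applied to
    the latent part it moves smoothly through the latent space.
    """
    return (1.0 - alpha) * a + alpha * b

rng = np.random.default_rng(0)
v_muon = make_generator_input(0, rng)   # class 0: muon ring (assumed ordering)
v_gamma = make_generator_input(1, rng)  # class 1: gamma shower, quadrant 1
v_mix = interpolate(v_muon, v_gamma, 0.5)  # midpoint in class + latent space
```

Feeding vectors like `v_mix` to a trained generator is what lets the network produce signals "between" the training classes, as in the interpolation result the abstract reports.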
DOI: https://doi.org/10.22323/1.444.0806

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.