PoS - Proceedings of Science
Volume 444 - 38th International Cosmic Ray Conference (ICRC2023) - Gamma-ray Astronomy (GA)
Neural Networks for Gamma Ray/Cosmic Ray Separation in Air Shower Observation with a Large Area Surface Scintillation Detector Array
S. Okukawa
Full text: pdf
Pre-published on: July 25, 2023
Published on:
The Tibet AS$\gamma$ experiment has been observing cosmic rays in the energy range from TeV to several tens of PeV using the Tibet-III air shower array since 1998.
In 2014, the experiment added an underground water-Cherenkov muon detector (MD) to separate cosmic gamma rays from the cosmic-ray background, and began hybrid observation with the two detectors.
This study developed methods to separate gamma-ray-induced air showers from hadronic cosmic-ray-induced ones using the measured particle number density distribution, in order to improve the sensitivity of cosmic gamma-ray measurements based on Tibet-III array data alone, taken before the installation of the MD.
We tested two approaches based on neural networks: the first uses feature values characterizing the spread of shower particles, derived from the measured particle number density distribution, and the second uses that distribution directly as image data.
To compare the separation performance of each method, we analyzed Monte Carlo air shower events with vertically incident, mono-energetic primary gamma rays and protons.
A separation method using a Multi-Layer Perceptron (MLP) trained on multiple feature values achieves AUC (Area Under the Curve) values of 0.748 at a gamma-ray energy of 10 TeV and 0.845 at 100 TeV.
A method using a Convolutional Neural Network (CNN) on the image data achieves AUC values of 0.781 at 10 TeV and 0.901 at 100 TeV, about 5% higher than those of the MLP.
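The AUC figures above can be read as the probability that a randomly chosen gamma-ray event receives a higher classifier score than a randomly chosen proton event. A minimal sketch of this rank-based computation, using purely illustrative scores (not the paper's data):

```python
import numpy as np

def auc(scores_signal, scores_background):
    """Area under the ROC curve via the Mann-Whitney U identity:
    the probability that a signal event outscores a background event,
    with ties counted as half a win."""
    s = np.asarray(scores_signal, dtype=float)
    b = np.asarray(scores_background, dtype=float)
    wins = (s[:, None] > b[None, :]).sum()   # signal strictly higher
    ties = (s[:, None] == b[None, :]).sum()  # equal scores count half
    return (wins + 0.5 * ties) / (len(s) * len(b))

# Hypothetical classifier outputs for gamma-like and proton-like events
gamma_scores = [0.9, 0.8, 0.7, 0.6]
proton_scores = [0.5, 0.65, 0.3, 0.2]
print(auc(gamma_scores, proton_scores))  # 0.9375
```

A perfect separator yields AUC = 1.0 and a random one 0.5, which is why the MLP-to-CNN improvements quoted above, while numerically modest, are meaningful for background rejection.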
DOI: https://doi.org/10.22323/1.444.0786

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.