PoS - Proceedings of Science
Volume 301 - 35th International Cosmic Ray Conference (ICRC2017) - Session Gamma-Ray Astronomy. GA-instrumentation
Probing Convolutional Neural Networks for Event Reconstruction in Gamma-Ray Astronomy with Cherenkov Telescopes
T.L. Holch*, I. Shilon, M. Büchele, T. Fischer, S. Funk, N. Groeger, D. Jankowsky, T. Lohse, U. Schwanke and P. Wagner
Pre-published on: August 16, 2017
Published on: August 03, 2018
Abstract
Dramatic progress has been made in the field of computer vision in recent years by applying deep learning techniques. State-of-the-art performance in image recognition is now achieved with Convolutional Neural Networks (CNNs). CNNs are a powerful class of artificial neural networks, characterized by fewer connections and free parameters than traditional neural networks and by the exploitation of spatial symmetries in the input data. Moreover, CNNs can automatically extract general characteristic features from data sets and create abstract data representations from which very robust predictions can be made. This suggests that experiments using Cherenkov telescopes could harness these powerful machine learning algorithms to improve the analysis of particle-induced air showers, where the properties of the primary shower particles are reconstructed from shower images recorded by the telescopes. In this work, we present initial results of a CNN-based analysis for background rejection and shower reconstruction, using simulation data from the H.E.S.S. experiment. We concentrate on supervised training methods and outline the influence of image sampling on the performance of the CNN model predictions.
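As an illustration of the kind of CNN-based background-rejection classifier described in the abstract, the sketch below builds a small convolutional network for binary gamma/hadron separation on camera images resampled to a square grid. It is not the architecture used in the paper: the input shape, layer sizes, and training setup are illustrative assumptions only.

```python
# Illustrative sketch only: a minimal CNN for gamma/hadron separation on
# Cherenkov camera images resampled to a 64x64 grid. Layer sizes, input
# shape, and training setup are assumptions, not the network from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models


def build_classifier(input_shape=(64, 64, 1)):
    """Small convolutional network for binary background rejection."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # gamma-likeness score
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    # Toy usage: random arrays stand in for simulated shower images
    # (label 1 = gamma-induced shower, 0 = hadronic background).
    x = np.random.rand(16, 64, 64, 1).astype("float32")
    y = np.random.randint(0, 2, size=(16, 1))
    model = build_classifier()
    model.fit(x, y, epochs=1, batch_size=8, verbose=0)
    print(model.predict(x[:2]))
```

Resampling the hexagonal camera pixels onto a regular square grid, as assumed above, is one possible way to feed the images to standard 2D convolutions; the paper's discussion of image sampling addresses exactly this kind of choice.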
DOI: https://doi.org/10.22323/1.301.0795

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.