Volume 501 - 39th International Cosmic Ray Conference (ICRC2025) - Cosmic-Ray Indirect
Reconstruction of cosmic-ray properties with uncertainty estimation using graph neural networks in GRAND
A. Ferriere*, A. Benoit-Levy and the GRAND Collaboration
*: corresponding author
Pre-published on: September 23, 2025
Abstract
The Giant Radio Array for Neutrino Detection (GRAND) aims to detect and study ultra-high-energy (UHE) neutrinos by observing the radio emission produced by extensive air showers. The GRANDProto300 prototype primarily focuses on UHE cosmic rays to demonstrate the autonomous detection and reconstruction techniques that will later be applied to neutrino detection. In this work, we propose a method for reconstructing the arrival direction and energy of the primary particle with high precision from noisy simulated voltage traces, using state-of-the-art machine learning techniques.

For each event, we represent the triggered antennas as a graph structure, which is used as input for a graph neural network (GNN). To significantly enhance precision and reduce the required training set size, we incorporate physical knowledge into both the GNN architecture and the input data. This approach achieves an angular resolution of 0.14° and a primary energy reconstruction resolution of about 15%. Additionally, we employ uncertainty estimation methods to improve the reliability of our predictions. These methods allow us to quantify the confidence of the GNN predictions and provide confidence intervals for the direction and energy reconstruction.
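To illustrate the graph representation described above, the following is a minimal sketch of how triggered antennas could be turned into a graph for a GNN. The k-nearest-neighbour construction, the toy antenna layout, and the function name `knn_edges` are illustrative assumptions, not the collaboration's actual pipeline.

```python
import math

def knn_edges(positions, k=2):
    """Connect each antenna to its k nearest neighbours.

    positions: list of (x, y) antenna coordinates for one event.
    Returns an undirected edge list of index pairs (i, j) with i < j,
    the kind of connectivity a GNN message-passing layer consumes.
    """
    edges = set()
    for i, pi in enumerate(positions):
        # sort the other antennas by Euclidean distance to antenna i
        dists = sorted(
            (math.dist(pi, pj), j) for j, pj in enumerate(positions) if j != i
        )
        for _, j in dists[:k]:
            edges.add((min(i, j), max(i, j)))  # store undirected edges once
    return sorted(edges)

# hypothetical event: four triggered antennas (x, y in metres)
antennas = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (150.0, 150.0)]
print(knn_edges(antennas, k=2))
# → [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
```

In practice each node would also carry per-antenna features (e.g. signal amplitude and timing), and the physics-informed inputs mentioned above would enter as node or edge attributes.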

Finally, we explore strategies to evaluate the consistency and robustness of the model when applied to real data. Our goal is to identify situations where predictions remain trustworthy despite domain shifts between simulation and reality.
DOI: https://doi.org/10.22323/1.501.0253

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.