Deep learning techniques for high-precision neutral meson reconstruction in the LHCf experiment
E. Berti*, A. Paccagnella and G. Piparo
*: corresponding author
Pre-published on: January 16, 2026
Abstract
The Large Hadron Collider forward (LHCf) experiment measures neutral particles produced in the very forward region of proton–proton collisions to constrain the hadronic interaction models used in ultra-high-energy cosmic-ray simulations. A key physics goal is the reconstruction of neutral mesons at very large pseudorapidity, in particular K⁰_S → π⁰π⁰ → 4γ decays, whose photons are highly collimated and tend to produce overlapping electromagnetic showers in the Small Tower (TS) and Large Tower (TL) of the Arm2 detector. This topology is difficult to handle with traditional methods based on simple energy sharing and centroid estimates. We present a multimodal deep learning strategy, generally applicable to neutral meson decays, that is specifically optimised for the identification and reconstruction of forward K⁰_S candidates. The pipeline combines calorimetric information and silicon microstrip profiles in a sequence of dedicated models for event selection, background rejection, topological classification of multi-photon patterns, and per-photon energy and position regression. The approach is developed and validated on detailed simulations of the Arm2 response in proton–proton collisions at LHC energies, and it is designed to provide a clean and well-characterised sample of forward K⁰_S decays, improving the experimental sensitivity to neutral meson production for hadronic interaction studies in cosmic-ray physics.
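As an illustration of the kinematics underlying such a reconstruction, the standard small-angle diphoton invariant-mass formula used in forward calorimetry can be sketched as follows. This is a generic sketch, not code from the proceedings; the function name, detector distance, and example numbers are illustrative assumptions:

```python
import math

def diphoton_mass(e1, e2, pos1, pos2, distance):
    """Invariant mass of a photon pair from calorimeter hits.

    Uses the small-angle approximation valid in the very forward
    region: m^2 = 2*E1*E2*(1 - cos(theta)) ~ E1*E2*theta^2, with
    theta ~ d / L, so m ~ sqrt(E1*E2) * d / L.

    e1, e2   : photon energies (any consistent unit, e.g. MeV)
    pos1/2   : (x, y) hit positions on the calorimeter face (mm)
    distance : distance from the interaction point (mm)
    """
    d = math.hypot(pos1[0] - pos2[0], pos1[1] - pos2[1])
    return math.sqrt(e1 * e2) * d / distance

# Illustrative numbers only: two ~1 TeV photons separated by 19 mm
# on a detector ~141 m from the interaction point reconstruct a
# mass close to the pi0 mass (~135 MeV).
m = diphoton_mass(1.0e6, 1.0e6, (0.0, 0.0), (19.0, 0.0), 141_000.0)
```

Per-photon energy and position regression, as described in the abstract, directly determines the inputs to such a formula, which is why resolving overlapping showers is critical for a clean mass peak.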
DOI: https://doi.org/10.22323/1.485.0665
Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.