PoS - Proceedings of Science
Volume 367 - XXIX International Symposium on Lepton Photon Interactions at High Energies (LeptonPhoton2019) - Posters
Vertex Reconstruction and Deep Learning Applications in JUNO
L. Ziyuan*, Z. You, Y. Zhang, J. Zhu, S. Zhang  on behalf of the JUNO collaboration
Pre-published on: November 15, 2019
Published on: December 17, 2019
Abstract
The Jiangmen Underground Neutrino Observatory (JUNO), currently under construction in the south of China, will be the largest liquid scintillator (LS) detector in the world. JUNO is a multipurpose neutrino experiment designed to determine the neutrino mass hierarchy, precisely measure oscillation parameters, and study solar neutrinos, supernova neutrinos, geo-neutrinos and atmospheric neutrinos. The central detector (CD) of JUNO contains 20,000 tons of LS, instrumented with about 18,000 20-inch and 25,000 3-inch photomultiplier tubes (PMTs). The energy resolution is expected to be $3\%/\sqrt{E\mathrm{(MeV)}}$. To meet the requirements of the experiment, two vertex reconstruction algorithms have been developed. One is a time likelihood method that uses the time and charge information of the PMTs and requires a good understanding of the complicated optical processes in the LS. The other is a deep learning method based on a convolutional neural network architecture, which is fast and avoids modelling the optical processes in detail. Overall, the two methods achieve similar performance; the deep learning method tends to give more accurate predictions near the detector border region, where the optical processes are more complicated.
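As an illustration of the deep learning approach described above, the sketch below shows how a small convolutional network could regress the event vertex (x, y, z) from 2D "images" of per-PMT charge and first-hit time. This is a minimal assumption-laden example, not the collaboration's code: the projection of the PMT sphere onto a fixed grid, the layer sizes, and the 128 x 224 grid shape are all hypothetical choices made only for the demonstration.

# Hedged sketch (PyTorch): CNN vertex regression from per-PMT charge/time maps.
# All network and input dimensions are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class VertexCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),  # 2 channels: charge, first-hit time
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 3)  # regress the vertex coordinates (x, y, z)

    def forward(self, x):
        x = self.features(x)            # (batch, 32, 1, 1)
        return self.head(x.flatten(1))  # (batch, 3)

# Toy usage: a batch of 4 events on a hypothetical 128 x 224 PMT grid.
model = VertexCNN()
images = torch.randn(4, 2, 128, 224)   # channel 0: charge, channel 1: first-hit time
vertices = model(images)               # predicted vertices, shape (4, 3)
loss = nn.MSELoss()(vertices, torch.zeros(4, 3))  # regression loss against (dummy) true vertices

Training such a regressor on simulated events with known true vertices is what lets the network learn the detector response directly from data, which is why the optical processes do not need to be modelled explicitly.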
DOI: https://doi.org/10.22323/1.367.0194
Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.