PoS - Proceedings of Science
Volume 395 - 37th International Cosmic Ray Conference (ICRC2021) - CRI - Cosmic Ray Indirect
Machine learning aided noise filtration and signal classification for CREDO experiment
Ł. Bibrzycki*, D. Alvarez-Castillo, O. Bar, D. Gora, P. Homola, P. Kovacs, M. Niedźwiecki, M. Piekarczyk, K. Rzecki, J. Stasielak, S. Stuglik, O. Sushchov and A. Tursunov
Full text: pdf
Pre-published on: August 04, 2021
Published on: March 18, 2022
The wealth of smartphone data collected by the Cosmic Ray Extremely Distributed Observatory (CREDO) greatly surpasses the capabilities of manual analysis, so efficient means of rejecting non-cosmic-ray noise and of identifying signals attributable to extensive air showers are necessary. To address these problems we discuss a Convolutional Neural Network-based method of artefact rejection and a complementary method of particle identification based on common statistical classifiers as well as their ensemble extensions. Both approaches rely on supervised learning, so a representative subset of the CREDO dataset is needed for training and validation. To this end, over 2300 images were selected and manually labeled by 5 judges. The images were split into spot, track and worm classes (collectively named signals) and an artefact class. Preprocessing then consisted of summing the luminance of the RGB channels (grayscaling) and removing the background by adaptive thresholding. For artefact rejection, a binary CNN-based classifier was proposed that distinguishes artefacts from signals. The classifier was fed with input data in the form of Daubechies wavelet transformed images.
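The preprocessing steps named above (luminance summation of the RGB channels followed by background removal via local adaptive thresholding) can be sketched in plain NumPy. This is an illustrative reconstruction, not the authors' code; the window radius and offset are assumed parameters, and a production pipeline would typically use a library routine such as OpenCV's adaptiveThreshold.

```python
import numpy as np

def grayscale(rgb):
    """Grayscale an H x W x 3 image by summing the R, G, B luminance channels."""
    return rgb.astype(np.float64).sum(axis=2)

def adaptive_threshold(gray, radius=3, offset=5.0):
    """Keep pixels brighter than their local-mean background plus an offset.

    Local means are computed over a (2*radius+1)^2 window using an
    integral image, so each window sum costs O(1).
    """
    h, w = gray.shape
    # integral image with a zero border row/column
    ii = np.pad(gray, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    out = np.zeros((h, w), dtype=bool)
    for y in range(h):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        for x in range(w):
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            s = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            mean = s / ((y1 - y0) * (x1 - x0))
            out[y, x] = gray[y, x] > mean + offset
    return out

# Toy example: a single bright spot on a dark frame survives thresholding.
frame = np.zeros((16, 16, 3), dtype=np.uint8)
frame[8, 8] = (200, 200, 200)
mask = adaptive_threshold(grayscale(frame))
```

Adaptive (local) thresholding is preferred over a single global cut here because smartphone sensor frames vary in background brightness from device to device and frame to frame.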
For cosmic ray signal classification, well-known feature-based classifiers were considered. As feature descriptors, we used Zernike moments with an additional feature related to the total image luminance.
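Zernike moments are rotation-invariant shape descriptors, which makes them a natural fit for tracks and worms appearing at arbitrary orientations. A minimal sketch of computing the moment magnitude |A_nm| over the unit disk is given below; the normalization and the choice of orders are illustrative assumptions, not the paper's exact feature set.

```python
import math
import numpy as np

def radial_poly(n, m, rho):
    """Zernike radial polynomial R_nm evaluated at radii rho (m >= 0)."""
    R = np.zeros_like(rho)
    for s in range((n - m) // 2 + 1):
        c = ((-1) ** s * math.factorial(n - s)
             / (math.factorial(s)
                * math.factorial((n + m) // 2 - s)
                * math.factorial((n - m) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    return R

def zernike_moment(img, n, m):
    """Magnitude of the Zernike moment A_nm of a square grayscale image.

    Pixel centers are mapped onto the unit disk; pixels outside it
    are ignored. The result is invariant under image rotation.
    """
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    X = (x - (w - 1) / 2) / (w / 2)
    Y = (y - (h - 1) / 2) / (h / 2)
    rho = np.hypot(X, Y)
    theta = np.arctan2(Y, X)
    disk = rho <= 1.0
    A = (n + 1) / np.pi * np.sum(
        img[disk] * radial_poly(n, abs(m), rho[disk])
        * np.exp(-1j * m * theta[disk]))
    return abs(A) / disk.sum()

# Rotation invariance: a rotated hit pattern yields the same |A_22|.
img = np.zeros((32, 32))
img[8, 20] = 1.0
z1 = zernike_moment(img, 2, 2)
z2 = zernike_moment(np.rot90(img), 2, 2)
```

A feature vector for a statistical classifier would then concatenate several such magnitudes with the total image luminance, e.g. `[zernike_moment(img, n, m) for (n, m) in orders] + [img.sum()]`.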
For the problem of artefact rejection, we obtained an accuracy of 99%. For the 4-class signal classification, the best performing classifiers achieved a recognition rate of 88%.
DOI: https://doi.org/10.22323/1.395.0227
Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.