Fast simulation with generative models at the LHC
L. Mijovic* on behalf of the ALICE, ATLAS, CMS and LHCb collaborations
*: corresponding author
Pre-published on: June 24, 2025
Published on: —
Abstract
The increasing integrated luminosity of the data collected at the major Large Hadron Collider experiments (ALICE, ATLAS, CMS and LHCb) necessitates increasingly large simulated samples. Given that the computational resources will not grow proportionally to the integrated luminosity, how can the experiments produce these large samples? A key technique the experiments use to address this challenge is replacing traditional detector simulation with generative machine learning models. These generative models achieve O(10-1000) times improvements in computational efficiency while maintaining high accuracy. Specifically, I discuss four solutions: ALICE's simulation of the Zero Degree Calorimeter with a Variational Autoencoder, ATLAS's use of Generative Adversarial Networks for calorimeter simulation, CMS's end-to-end FlashSim simulation based on Normalising Flows, and LHCb's Lamarr pipeline employing Generative Adversarial Networks. The speed-up and physics performance achieved by these solutions cement the status of generative models as a viable, faster alternative to the established simulation techniques, an important step towards addressing the computational demands of current and future LHC data analyses.
DOI: https://doi.org/10.22323/1.478.0156