PoS - Proceedings of Science
Volume 340 - The 39th International Conference on High Energy Physics (ICHEP2018) - Parallel: Detector
Level-1 track finding with an all-FPGA system at CMS for the HL-LHC
T.O. James* on behalf of the CMS collaboration
*corresponding author
Published on: August 02, 2019
Abstract
The CMS experiment at the LHC is designed to study a wide range of high-energy physics phenomena. It employs a large all-silicon tracker inside a superconducting solenoid providing a 3.8 T magnetic field, which in particular allows precise measurements of transverse momentum ($p_\mathrm{T}$) and vertex position.

This tracking detector will be upgraded to coincide with the installation of the High-Luminosity LHC (HL-LHC), which will deliver an instantaneous luminosity of up to about $7.5 \times 10^{34}\,\mathrm{cm^{-2}\,s^{-1}}$ to CMS, corresponding to around 200 collisions per 25 ns bunch crossing. The new tracker must maintain the nominal physics performance in this more challenging environment. Novel tracking modules that use pairs of closely spaced silicon sensors to discriminate on track $p_\mathrm{T}$ have been developed; they allow only hits compatible with tracks of $p_\mathrm{T} > 2$--$3$ GeV to be read out to the off-detector trigger electronics. This makes tracking information available at the Level-1 trigger of the experiment, which is required to keep the Level-1 trigger rate below the 750 kHz target while maintaining physics sensitivity.
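The on-module $p_\mathrm{T}$ discrimination can be sketched in a few lines: in a solenoidal field the transverse offset ("bend") between the two hits of a closely spaced sensor pair grows with track curvature, i.e. with $1/p_\mathrm{T}$, so a simple bend window selects high-$p_\mathrm{T}$ candidates. The geometry numbers below (sensor spacing, module radius) are illustrative assumptions, not the actual detector parameters.

```python
# Toy illustration of pT discrimination in a two-sensor module.
# Assumed numbers -- NOT the real CMS Phase-2 tracker geometry.

B_FIELD_T = 3.8            # CMS solenoid field strength
SENSOR_SPACING_M = 1.6e-3  # assumed spacing between the two sensors
MODULE_RADIUS_M = 0.5      # assumed radial position of the module

def bend_for_pt(pt_gev):
    """Small-angle estimate of the hit-pair bend for a track of given pT.

    Uses pT [GeV] = 0.3 * B [T] * R [m] for the bending radius R, and
    bend = d * r / (2 R) for sensor spacing d at module radius r.
    """
    bending_radius = pt_gev / (0.3 * B_FIELD_T)
    return SENSOR_SPACING_M * MODULE_RADIUS_M / (2.0 * bending_radius)

def passes_pt_cut(bend_m, pt_threshold_gev=2.0):
    """Read out a hit pair only if its bend is compatible with
    pT above the threshold (smaller bend = higher pT)."""
    return abs(bend_m) <= bend_for_pt(pt_threshold_gev)
```

Because the cut is a fixed comparison per module, it can be applied in the front-end electronics before any data leave the detector, which is what makes the reduced readout rate possible.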

This article presents a concept for an all-FPGA track finder based on a time-multiplexed architecture. Hardware demonstrators have been assembled to prove the feasibility and capability of such a system. The performance in a variety of physics scenarios is discussed, as well as the proposed scaling of the demonstrators, using newer technologies, to the final system.
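The essence of a time-multiplexed architecture can be sketched as follows: instead of partitioning each event geographically across many processors, the data from all detector regions for one bunch crossing are routed to a single processing node, and successive crossings are distributed round-robin over the available nodes, so each node sees complete events. This is a generic sketch of the principle, not the CMS firmware or data format.

```python
# Toy sketch of time multiplexing: whole events are assigned to
# processing nodes in round-robin fashion (illustrative only).

def assign_node(event_id, n_nodes):
    """Round-robin assignment of a complete event to one node."""
    return event_id % n_nodes

def route(events, n_nodes):
    """Group events by the node that will process them.

    `events` is a list of (event_id, region_data) pairs; every node
    receives the FULL data of the events assigned to it.
    """
    buffers = {node: [] for node in range(n_nodes)}
    for event_id, regions in events:
        buffers[assign_node(event_id, n_nodes)].append((event_id, regions))
    return buffers
```

The benefit is that each node can run a self-contained track-finding algorithm over an entire event without exchanging partial results with its neighbours, at the cost of a fixed latency for collecting one crossing's data from the whole detector.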
DOI: https://doi.org/10.22323/1.340.0202

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.