PoS - Proceedings of Science
Volume 287 - The 25th International workshop on vertex detectors (Vertex 2016) - Session: Poster
Fast pattern recognition of ATLAS L1 track trigger for HL-LHC
M. Martensson*  on behalf of the ATLAS Collaboration
Pre-published on: February 09, 2017
Published on: August 03, 2017
Abstract
A fast hardware-based track trigger is being developed in ATLAS for the High Luminosity upgrade of the Large Hadron Collider (HL-LHC). The goal is to achieve, by adding tracking information to the ATLAS hardware trigger, trigger levels in the high pile-up conditions of the HL-LHC that are similar to or better than those achieved under low pile-up conditions. A method for fast pattern recognition using the Hough transform is investigated. In this method, detector hits are mapped onto a 2D parameter space, with one parameter related to the transverse momentum and one to the initial track direction. The performance of the Hough transform is studied at different pile-up values. Using full simulation of events with an average pile-up of 200, it is also compared with a method that matches detector hits to pattern banks of simulated tracks stored in custom-made Associative Memory ASICs. The pattern recognition is followed by a track-fitting step which calculates the track parameters. The speed and precision of the track fitting depend on the quality of the hits selected by the pattern-recognition step. The figures of merit of the pattern recognition are the efficiency for finding hits from high-transverse-momentum tracks and the power to reject hits from low-transverse-momentum tracks and fake tracks.
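The 2D parameter space described in the abstract can be illustrated with a minimal voting sketch. This is a hypothetical toy, not the ATLAS implementation: it assumes tracks from the beamline in a uniform solenoid field B, so a hit at radius r and azimuth phi (in the transverse plane) approximately satisfies phi0 = phi + 0.15 * B * r * (q/pT), with pT in GeV, r in metres and B in tesla. Each hit then votes along a line in the (phi0, q/pT) accumulator, and hits from the same track pile up in one bin. All names and binning choices here are illustrative.

```python
import numpy as np

def hough_vote(hits, B=2.0, n_phi0=64, n_qpt=16, qpt_max=0.5):
    """Toy Hough transform for track finding (illustrative, not the ATLAS code).

    hits  : iterable of (r, phi) pairs, r in metres, phi in radians
    Returns a (n_phi0, n_qpt) accumulator and the q/pT bin centres (1/GeV).
    """
    acc = np.zeros((n_phi0, n_qpt), dtype=int)
    qpt_bins = np.linspace(-qpt_max, qpt_max, n_qpt)  # q/pT bin centres
    for r, phi in hits:
        # One candidate phi0 per q/pT bin: the hit's line in parameter space.
        phi0 = phi + 0.15 * B * r * qpt_bins
        i_phi0 = ((phi0 % (2 * np.pi)) / (2 * np.pi) * n_phi0).astype(int)
        acc[i_phi0, np.arange(n_qpt)] += 1  # one vote per column
    return acc, qpt_bins
```

Hits from a single high-pT track all vote for the same (phi0, q/pT) bin, so a peak in the accumulator marks a track candidate; hits from low-pT or fake tracks spread their votes across many bins, which is the rejection power the abstract refers to.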
DOI: https://doi.org/10.22323/1.287.0069

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.