TrackFormers Part 2: Enhanced Transformer-Based Models for High-Energy Physics Track Reconstruction

📅 2025-09-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
In the high-luminosity LHC era, the deluge of detector hit data poses critical accuracy and efficiency bottlenecks for traditional particle track reconstruction. Building on the earlier "TrackFormers" work, which introduced one-shot, encoder-only Transformer models that assign detector hits directly to tracks, this study extends the approach with loss functions that capture inter-hit correlations, a detailed investigation of Transformer attention mechanisms, and a study of higher-level object reconstruction. New hit-level datasets covering a range of physics processes further enable training across multiple processes. Together, these developments aim to improve both the accuracy and the efficiency of Transformer-based tracking, working toward a purely deep-learning reconstruction pipeline for next-generation high-energy physics experiments that avoids intermediate clustering or heuristic steps.
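The summary above mentions attention over detector hits that reflects physical correlations among them. The paper's actual mechanism is not specified on this page; one illustrative way to inject hit geometry into attention is an additive bias that penalises spatially distant hit pairs. A minimal NumPy sketch (the function name, the random stand-in projections, and the distance-bias form are all assumptions, not the authors' method):

```python
import numpy as np

def distance_biased_attention(hits, d_k=8, alpha=1.0, seed=0):
    """Scaled dot-product attention over detector hits with an additive
    bias that down-weights attention between spatially distant hit pairs.

    hits: (N, 3) array of (x, y, z) hit coordinates.
    Returns (attended values, attention weights).
    """
    rng = np.random.default_rng(seed)
    # Random projections stand in for learned Q/K/V weight matrices.
    wq, wk, wv = (rng.standard_normal((3, d_k)) for _ in range(3))
    q, k, v = hits @ wq, hits @ wk, hits @ wv

    scores = q @ k.T / np.sqrt(d_k)
    # Geometry-motivated bias: far-apart hits attend to each other less.
    dist = np.linalg.norm(hits[:, None, :] - hits[None, :, :], axis=-1)
    scores = scores - alpha * dist

    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

In a real model the bias term would be learned or derived from detector geometry rather than a fixed Euclidean penalty; the sketch only shows where such a term enters the attention computation.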

📝 Abstract
High-Energy Physics experiments are rapidly escalating in generated data volume, a trend that will intensify with the upcoming High-Luminosity LHC upgrade. This surge in data necessitates critical revisions across the data processing pipeline, with particle track reconstruction being a prime candidate for improvement. In our previous work, we introduced "TrackFormers", a collection of Transformer-based one-shot encoder-only models that effectively associate hits with expected tracks. In this study, we extend our earlier efforts by incorporating loss functions that account for inter-hit correlations, conducting detailed investigations into various Transformer attention mechanisms, and studying the reconstruction of higher-level objects. Furthermore, we discuss new datasets that allow training at the hit level for a range of physics processes. These developments collectively aim to boost the accuracy, and potentially the efficiency, of our tracking models, offering a robust solution to meet the demands of next-generation high-energy physics experiments.
Problem

Research questions and friction points this paper is trying to address.

Enhancing transformer models for particle track reconstruction in high-energy physics
Incorporating inter-hit correlations and studying transformer attention mechanisms
Improving tracking accuracy and efficiency for next-generation physics experiments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporated loss functions with inter-hit correlations
Investigated various Transformer attention mechanisms
Trained models on new datasets for multiple physics processes
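One listed innovation is a loss function that accounts for inter-hit correlations. The paper's formulation is not given on this page; as a purely illustrative sketch (function name and formulation assumed), a pairwise consistency term over soft track assignments could reward hit pairs from the same true track for agreeing on their predicted track, and penalise agreement between hits from different tracks:

```python
import numpy as np

def pairwise_track_consistency_loss(probs, track_ids):
    """Illustrative inter-hit correlation loss.

    probs: (N, C) per-hit soft track-assignment probabilities.
    track_ids: (N,) ground-truth track label for each hit.
    """
    # same[i, j] = 1 if hits i and j belong to the same true track.
    same = (track_ids[:, None] == track_ids[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)

    # agree[i, j] = probability hits i and j land in the same predicted track.
    agree = probs @ probs.T

    eps = 1e-9
    # Same-track pairs should agree; different-track pairs should not.
    loss_same = -np.log(agree + eps) * same
    loss_diff = -np.log(1.0 - agree + eps) * (1.0 - same)
    np.fill_diagonal(loss_diff, 0.0)  # exclude self-pairs

    n_pairs = same.size - len(track_ids)  # ordered off-diagonal pairs
    return (loss_same + loss_diff).sum() / n_pairs
```

With one-hot assignments that match the true labels this loss is essentially zero, and it grows as correlated hits are split across predicted tracks; in practice such a term would be combined with a per-hit classification loss.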
Sascha Caron
Associate Professor (UHD), Radboud University Nijmegen and Nikhef, Amsterdam
Particle Physics, Machine Learning, Astrophysics, Artificial Intelligence, Dark Matter
Nadezhda Dobreva
High-Energy Physics, Radboud University, Nijmegen, The Netherlands
Maarten Kimpel
Institute for Computing and Information Sciences, Radboud University, Nijmegen, The Netherlands
Uraz Odyurt
Faculty of Engineering Technology, University of Twente, Enschede, The Netherlands
Slav Pshenov
National Institute for Subatomic Physics (Nikhef), Amsterdam, The Netherlands
Roberto Ruiz de Austri Bazan
Instituto de Física Corpuscular, IFIC-UV/CSIC, Valencia, Spain
Eugene Shalugin
High-Energy Physics, Radboud University, Nijmegen, The Netherlands
Zef Wolffs
National Institute for Subatomic Physics (Nikhef), Amsterdam, The Netherlands
Yue Zhao
High Performance Machine Learning, SURF, Amsterdam, The Netherlands