A Lorentz-Equivariant Transformer for All of the LHC

📅 2024-11-01
🏛️ arXiv.org
📈 Citations: 25
Influential: 1
🤖 AI Summary
The lack of Lorentz symmetry in existing machine learning models limits their generalization in high-energy physics. Method: The authors propose the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr), a strictly Lorentz-equivariant Transformer architecture built on the geometric algebra of spacetime, with particle four-momenta as inputs. This is the first unified Lorentz-equivariant design applied across the major high-energy physics tasks—amplitude regression, jet classification, and generative modeling—while also supporting controllable symmetry breaking. Geometric algebra representations and equivariant embeddings jointly enable both discriminative and generative capabilities. Contribution/Results: L-GATr achieves state-of-the-art performance on all three LHC benchmark tasks, outperforming prior architectures. These results provide empirical evidence that explicit Lorentz-equivariant inductive biases yield consistent, substantial gains across diverse machine learning applications in high-energy physics.

📝 Abstract
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break symmetries if needed. We demonstrate the power of L-GATr for amplitude regression and jet classification, and then benchmark it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find significant improvements over previous architectures.
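The abstract's central claim—that data can be represented so that physics stays consistent under Lorentz transformations—rests on the fact that the Minkowski inner product of four-momenta is Lorentz-invariant. The following minimal sketch (not the L-GATr code itself; all names here are illustrative) demonstrates this invariance numerically for a boost along the x-axis:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -), as used for particle four-momenta.
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (in units of c)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

def minkowski(p, q):
    """Lorentz-invariant inner product p.q = p^T eta q."""
    return p @ ETA @ q

# A toy four-momentum (E, px, py, pz) for a massive particle.
p = np.array([5.0, 3.0, 0.0, 0.0])

L = boost_x(0.6)
p_boosted = L @ p

# The invariant mass squared p.p is unchanged by the boost:
print(minkowski(p, p))                  # -> 16.0 (m^2 before the boost)
print(minkowski(p_boosted, p_boosted))  # -> 16.0 (same after the boost)
```

Networks that build their internal computations from such invariants (and from objects that transform covariantly, like the four-momenta themselves) inherit Lorentz equivariance by construction.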
Problem

Research questions and friction points this paper is trying to address.

Achieving Lorentz-equivariance in machine learning for LHC physics
Improving performance on amplitude regression and jet classification tasks
Developing first Lorentz-equivariant generative network for particle physics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lorentz-Equivariant Geometric Algebra Transformer architecture
Scalable transformer that remains equivariant under Lorentz transformations
Versatile architecture capable of breaking symmetries when needed
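To illustrate the equivariance idea behind the innovations above, here is a hedged toy sketch (again not the paper's architecture; the layer and its inputs are invented for illustration) of an attention-style layer whose weights are computed from Lorentz-invariant inner products, so that boosting the inputs and applying the layer commute:

```python
import numpy as np

# Minkowski metric, signature (+, -, -, -).
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (in units of c)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

def equivariant_attention(P):
    """Toy equivariant layer: attention weights come from pairwise
    Minkowski inner products (Lorentz-invariant), and the values are
    the four-momenta themselves (Lorentz-covariant)."""
    scores = P @ ETA @ P.T                  # invariant under any Lorentz transform
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ P                      # weighted sums of four-momenta

# Two toy four-momenta (rows), each (E, px, py, pz).
P = np.array([[5.0, 3.0, 0.0, 0.0],
              [4.0, 0.0, 2.0, 1.0]])
L = boost_x(0.5)

# Equivariance check: layer-then-boost equals boost-then-layer.
out_then_boost = equivariant_attention(P) @ L.T
boost_then_out = equivariant_attention(P @ L.T)
print(np.allclose(out_then_boost, boost_then_out))  # -> True
```

Because the attention scores depend only on invariants, the whole layer commutes with Lorentz transformations; symmetry breaking, when needed, can be introduced by feeding in extra reference vectors rather than by changing the layer.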