🤖 AI Summary
The lack of Lorentz equivariance in existing machine learning models limits their generalization in high-energy physics. Method: We propose the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr), a Lorentz-equivariant Transformer architecture built within the spacetime geometric algebra framework, taking particle four-momenta as inputs. To our knowledge, this is the first work to achieve a unified Lorentz-equivariant design across the major high-energy physics tasks of amplitude regression, jet classification, and generative modeling, while also supporting controllable symmetry breaking when needed. Geometric algebra representations and equivariant embeddings jointly enable both discriminative and generative capabilities. Contribution/Results: L-GATr achieves state-of-the-art performance on all three LHC benchmark tasks, significantly outperforming prior methods. These results empirically validate that explicit Lorentz-equivariant inductive biases yield consistent, substantial gains across diverse high-energy physics machine learning applications.
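To make the input representation concrete, here is a minimal sketch of how particle four-momenta can be embedded as multivectors of the spacetime geometric algebra Cl(1,3), in the spirit of what the summary describes. The component ordering, the `GRADE_SLICES` layout, and the `embed_four_momentum` helper are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

# A Cl(1,3) multivector has 16 components, split by grade:
# 1 scalar + 4 vector + 6 bivector + 4 trivector + 1 pseudoscalar.
GRADE_SLICES = {
    0: slice(0, 1),    # scalar
    1: slice(1, 5),    # vector: (E, p_x, p_y, p_z)
    2: slice(5, 11),   # bivector
    3: slice(11, 15),  # trivector
    4: slice(15, 16),  # pseudoscalar
}

def embed_four_momentum(p):
    """Place a four-momentum p^mu in the vector slot of an otherwise empty multivector."""
    mv = np.zeros(16)
    mv[GRADE_SLICES[1]] = p
    return mv

# Each particle becomes one multivector-valued token for the transformer.
event = np.stack([
    embed_four_momentum([45.0, 12.0, -30.0, 8.0]),
    embed_four_momentum([60.0, -5.0, 22.0, 50.0]),
])
print(event.shape)  # (2, 16)
```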
📝 Abstract
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break symmetries if needed. We demonstrate the power of L-GATr for amplitude regression and jet classification, and then benchmark it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find significant improvements over previous architectures.
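As a quick illustration of the equivariance property the abstract refers to, the following sketch checks f(Λp) = Λ f(p) for a toy grade-wise scaling layer, the simplest kind of map that commutes with Lorentz transformations. The `boost_x` and `layer` functions are hypothetical stand-ins, not L-GATr's layers.

```python
import numpy as np

def boost_x(beta):
    """Lorentz boost along x acting on four-vectors (E, p_x, p_y, p_z)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

def layer(p, w=0.7):
    """Toy equivariant layer: a grade-wise (here, vector-grade) scaling."""
    return w * p

p = np.array([45.0, 12.0, -30.0, 8.0])  # illustrative four-momentum
L = boost_x(0.6)

# Equivariance: transforming the input then applying the layer
# equals applying the layer then transforming the output.
assert np.allclose(layer(L @ p), L @ layer(p))

# The Minkowski norm (invariant mass squared) is unchanged by the boost.
eta = np.diag([1.0, -1.0, -1.0, -1.0])
assert np.isclose(p @ eta @ p, (L @ p) @ eta @ (L @ p))
print("equivariance and invariance checks passed")
```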