SEAL - A Symmetry EncourAging Loss for High Energy Physics

📅 2025-11-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
In high-energy physics, trained models often fail to exactly satisfy fundamental physical symmetries such as Lorentz covariance, in part because real experiments only respect those symmetries up to finite resolution; this hurts robustness, data efficiency, and interpretability. To address this, we propose a **soft symmetry-constrained loss function**, grounded in group-theoretic infinitesimal transformations and input-space perturbation regularization. Rather than enforcing hard constraints, the loss lets the model balance symmetry preservation against the task objective during training, and it requires no architectural modifications to existing neural networks, only an additional term at the loss layer. We validate the approach on top-quark jet classification and Lorentz-equivariant regression tasks. Results demonstrate substantial improvements in model robustness and few-shot generalization while maintaining high accuracy and computational efficiency. Our method establishes a flexible, practical paradigm for symmetry-informed machine learning in physics, bridging theoretical principles with empirical deep learning performance.
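The input-transformation variant of such a soft constraint can be sketched as a penalty on the change in model output under a symmetry transformation of the inputs. The sketch below is illustrative only, not the paper's implementation: it assumes a generic callable `model`, uses an azimuthal rotation of the `(px, py)` components as the example symmetry, and the names `symmetry_penalty` and `seal_loss` are hypothetical.

```python
import numpy as np

def rotate_phi(particles, dphi):
    """Rotate the (px, py) components of each particle by angle dphi
    (an azimuthal rotation, a common symmetry in jet physics).
    particles: array of shape (batch, n_particles, 2)."""
    c, s = np.cos(dphi), np.sin(dphi)
    rotated = particles.copy()
    px, py = particles[..., 0], particles[..., 1]
    rotated[..., 0] = c * px - s * py
    rotated[..., 1] = s * px + c * py
    return rotated

def symmetry_penalty(model, x, dphi=1e-2):
    """Soft symmetry constraint: penalize the squared change in the model
    output under a small symmetry transformation of the inputs."""
    return np.mean((model(x) - model(rotate_phi(x, dphi))) ** 2)

def seal_loss(task_loss, model, x, lam=1.0):
    """Total loss = task loss + lambda * soft symmetry penalty.
    lam controls how strongly the symmetry is encouraged."""
    return task_loss + lam * symmetry_penalty(model, x)
```

A rotation-invariant model (e.g., one depending only on each particle's transverse momentum) incurs essentially zero penalty, while a model sensitive to the absolute azimuthal angle is penalized, which is the intended soft-constraint behavior.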

📝 Abstract
Physical symmetries provide a strong inductive bias for constructing functions to analyze data. In particular, this bias may improve robustness, data efficiency, and interpretability of machine learning models. However, building machine learning models that explicitly respect symmetries can be difficult due to the dedicated components required. Moreover, real-world experiments may not exactly respect fundamental symmetries at the level of finite granularities and energy thresholds. In this work, we explore an alternative approach to create symmetry-aware machine learning models. We introduce soft constraints that allow the model to decide the importance of added symmetries during the learning process instead of enforcing exact symmetries. We investigate two complementary approaches, one that penalizes the model based on specific transformations of the inputs and one inspired by group theory and infinitesimal transformations of the inputs. Using top quark jet tagging and Lorentz equivariance as examples, we observe that the addition of the soft constraints leads to more robust performance while requiring negligible changes to current state-of-the-art models.
Problem

Research questions and friction points this paper is trying to address.

Developing symmetry-aware ML models without exact enforcement
Addressing imperfect symmetry compliance in real-world experiments
Enhancing model robustness through soft symmetry constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Soft constraints enable symmetry-aware learning
Penalizes model based on input transformations
Uses group theory for infinitesimal transformations
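The group-theoretic variant can be sketched as penalizing the directional derivative of the model output along a symmetry generator, which vanishes exactly when the model is invariant. The code below is a hedged illustration, not the paper's method: it approximates the derivative with a central finite difference, uses an azimuthal rotation as the example one-parameter group, and the name `generator_penalty` is hypothetical.

```python
import numpy as np

def rotate_phi(particles, dphi):
    """One-parameter rotation group acting on the (px, py) plane.
    particles: array of shape (batch, n_particles, 2)."""
    c, s = np.cos(dphi), np.sin(dphi)
    out = particles.copy()
    out[..., 0] = c * particles[..., 0] - s * particles[..., 1]
    out[..., 1] = s * particles[..., 0] + c * particles[..., 1]
    return out

def generator_penalty(model, x, group_action=rotate_phi, eps=1e-4):
    """Central-difference estimate of the squared derivative of the model
    output along the group orbit at the identity,
        | d/de model(exp(e*G) x) |_{e=0}^2 ,
    which is zero for a model invariant under the generator G."""
    deriv = (model(group_action(x, eps)) - model(group_action(x, -eps))) / (2 * eps)
    return np.mean(deriv ** 2)
```

Because the penalty is built from the infinitesimal action rather than a fixed finite transformation, it encourages invariance under the whole one-parameter group, not just one sampled rotation angle.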
Pradyun Hebbar
Physics Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720, USA and Department of Particle Physics, University of Geneva, Geneva 1205, Switzerland
Thandikire Madula
University College London, Gower Street, London, WC1E 6BT, UK
Vinicius Mikuni
Postdoctoral Scholar, LBNL
Machine Learning, HEP
Benjamin Nachman
Staff Scientist, Lawrence Berkeley National Laboratory
Particle Physics, Deep Learning, Quantum Computing, Solid State Detectors
N. Outmezguine
Berkeley Center for Theoretical Physics, University of California, Berkeley, CA 94720, USA and Physics Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720, USA
I. Savoray
Berkeley Center for Theoretical Physics, University of California, Berkeley, CA 94720, USA and Physics Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720, USA