🤖 AI Summary
This work addresses the challenge of modeling continuous symmetries in neural networks. Methodologically, it introduces a vector-field-based framework for discovering and enforcing Lie group symmetries, achieving the first automatic discovery of non-affine symmetries of neural network functions. It designs a vector-field-driven symmetry regularization mechanism and constrains the search space to infinitesimal isometries (i.e., Killing vector fields on Riemannian manifolds) to ensure physical consistency. Theoretically, it proves that this restriction preserves both the completeness and stability of symmetry discovery. Empirically, the approach demonstrates improved accuracy and data efficiency on physics simulation and image tasks. The core contribution is a unified treatment of non-affine Lie group symmetries and a tightly coupled modeling paradigm linking vector fields, Lie algebras, and isometric constraints.
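The "vector-field-driven symmetry regularization" idea can be illustrated with a minimal sketch: a function f is invariant under the one-parameter group generated by a vector field V exactly when the Lie derivative V · ∇f vanishes, so the squared Lie derivative can serve as a regularization penalty. The function `f`, the rotation generator V(x, y) = (-y, x), and the finite-difference gradient below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical sketch of a Lie-derivative symmetry penalty.
# A function f is invariant under the flow of V iff (V . grad f) = 0,
# so mean[(V . grad f)^2] over sample points can act as a regularizer.

def f(x, y):
    return x**2 + y**2  # rotation-invariant test function


def lie_derivative(f, x, y, eps=1e-5):
    # Central finite differences approximate grad(f)
    dfdx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)
    dfdy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)
    # Rotation generator V(x, y) = (-y, x)
    return -y * dfdx + x * dfdy


pts = np.random.default_rng(0).normal(size=(100, 2))
penalty = np.mean([lie_derivative(f, px, py) ** 2 for px, py in pts])
print(penalty)  # ~0, since f is rotation-invariant
```

In a training loop, the same penalty would be computed with automatic differentiation on the network's output and added to the task loss, with V either fixed (enforcement) or learned (discovery).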
📝 Abstract
Symmetry-informed machine learning can exhibit advantages over machine learning that does not account for symmetry. Recently, attention has been given to continuous symmetry discovery using vector fields that serve as infinitesimal generators of Lie group symmetries. In this paper, we extend non-affine symmetry discovery to functions defined by neural networks. We further extend work in this area by introducing symmetry enforcement of smooth models using vector fields. Finally, we extend symmetry discovery using vector fields by providing theoretical and experimental results on restricting the symmetry search space to infinitesimal isometries.
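The restriction to infinitesimal isometries can be sketched concretely: on flat Euclidean space, a vector field V is a Killing field (an infinitesimal isometry) exactly when its Jacobian is antisymmetric, i.e., ∂ᵢVⱼ + ∂ⱼVᵢ = 0. The numerical check below is an illustrative assumption about how such a constraint might be tested, not the paper's method; the `rotation` and `scaling` fields are standard examples.

```python
import numpy as np

# Hypothetical sketch: test whether a vector field on flat R^2 is an
# infinitesimal isometry via Killing's equation d_i V_j + d_j V_i = 0.

def killing_residual(V, x, eps=1e-5):
    # Jacobian J[i, j] = dV_i / dx_j by central differences
    n = len(x)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (V(x + e) - V(x - e)) / (2 * eps)
    # The symmetric part of J must vanish for a Killing field
    return np.abs(J + J.T).max()


rotation = lambda p: np.array([-p[1], p[0]])  # Killing field on flat R^2
scaling = lambda p: np.array([p[0], p[1]])    # volume-changing, not an isometry

x = np.array([0.7, -1.3])
print(killing_residual(rotation, x))  # ~0: rotations are isometries
print(killing_residual(scaling, x))   # ~2: scaling fails Killing's equation
```

Restricting a learned generator to fields with small Killing residual is one way to confine the symmetry search space to isometries, which is the kind of constraint the abstract describes.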