Lie Algebra Canonicalization: Equivariant Neural Operators under arbitrary Lie Groups

📅 2024-10-03
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Neural operators struggle to achieve strict equivariance under non-compact Lie groups (e.g., the scaling and affine groups), whose symmetries are typically characterized only by their infinitesimal generators, while existing equivariant architectures rely on global group structure. Method: the authors propose Lie Algebra Canonicalization (LieLAC), the first framework to embed infinitesimal generators directly into a canonicalization pipeline. It establishes a theoretical link to frame averaging over continuous non-compact groups without requiring full group representations, enabling plug-and-play equivariance for unconstrained pre-trained models. Contributions/Results: combined with Lie-group optimization, symmetry-driven input normalization, and physics-informed neural networks (PINNs), LieLAC significantly improves generalization and physical consistency on invariant image classification and neural PDE solving, providing a generic, scalable route to equivariant modeling under arbitrary Lie point symmetries.

📝 Abstract
The quest for robust and generalizable machine learning models has driven recent interest in exploiting symmetries through equivariant neural networks. In the context of PDE solvers, recent works have shown that Lie point symmetries can be a useful inductive bias for Physics-Informed Neural Networks (PINNs) through data and loss augmentation. Despite this, directly enforcing equivariance within the model architecture for these problems remains elusive. This is because many PDEs admit non-compact symmetry groups, oftentimes not studied beyond their infinitesimal generators, making them incompatible with most existing equivariant architectures. In this work, we propose Lie aLgebrA Canonicalization (LieLAC), a novel approach that exploits only the action of infinitesimal generators of the symmetry group, circumventing the need for knowledge of the full group structure. To achieve this, we address existing theoretical issues in the canonicalization literature, establishing connections with frame averaging in the case of continuous non-compact groups. Operating within the framework of canonicalization, LieLAC can easily be integrated with unconstrained pre-trained models, transforming inputs to a canonical form before feeding them into the existing model, effectively aligning the input for model inference according to allowed symmetries. LieLAC utilizes standard Lie group descent schemes, achieving equivariance in pre-trained models. Finally, we showcase LieLAC's efficacy on tasks of invariant image classification and Lie point symmetry equivariant neural PDE solvers using pre-trained models.
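The canonicalization idea in the abstract can be sketched in a few lines: descend over a group parameter generated by an infinitesimal generator until an energy function picks out a canonical representative, then feed that representative to the frozen model. The sketch below is illustrative only, not the paper's implementation: it uses a one-parameter (non-compact) scaling group, a toy energy, and hypothetical names (`canonicalize_scale`, `energy`).

```python
import numpy as np

def canonicalize_scale(x, energy, lr=0.1, steps=200, eps=1e-5):
    """Gradient descent on the scaling-group parameter t, where the
    group acts as x -> exp(t) * x (its infinitesimal generator is the
    identity vector field). The descent minimizes energy(exp(t) * x)
    over t, using a central finite-difference gradient for simplicity,
    and returns the canonical representative exp(t*) * x."""
    t = 0.0
    for _ in range(steps):
        g = (energy(np.exp(t + eps) * x) - energy(np.exp(t - eps) * x)) / (2 * eps)
        t -= lr * g
    return np.exp(t) * x

# Toy energy whose minimizer fixes unit L2 norm as the canonical scale.
energy = lambda y: np.log(np.linalg.norm(y)) ** 2

x = np.array([3.0, 4.0])          # norm 5
x_canon = canonicalize_scale(x, energy)
# Any frozen, unconstrained model applied to x_canon is now invariant
# to rescalings of x: canonicalize_scale(s * x, energy) lands on the
# same canonical point for any s > 0.
```

In the paper the descent runs over several generators of a possibly non-compact symmetry group using standard Lie group descent schemes; this sketch keeps a single generator to stay short.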
Problem

Research questions and friction points this paper is trying to address.

Enforcing equivariance in neural networks for PDE solvers
Handling non-compact symmetry groups in machine learning models
Integrating Lie group theory with pre-trained neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

LieLAC enforces equivariance via infinitesimal generators
Integrates canonicalization with pre-trained neural models
Applies Lie group descent for symmetry alignment