Recurrent Equivariant Constraint Modulation: Learning Per-Layer Symmetry Relaxation from Data

πŸ“… 2026-02-02
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work addresses the challenge that strict equivariance constraints, while improving generalization, often hinder optimization, and that existing relaxation strategies rely on task-specific hyperparameters with high tuning costs. To overcome this, the authors propose Recurrent Equivariant Constraint Modulation (RECM), a mechanism that adaptively learns the degree of equivariance relaxation for each layer directly from the training signal and the symmetry of the layer's input–target distribution, without requiring any prior knowledge of the task. Theoretically, they establish that each layer's relaxation level converges to a value upper-bounded by its symmetry gap, enabling a principled balance between symmetry preservation and model flexibility. Empirically, RECM consistently outperforms existing methods across diverse exact and approximate equivariance tasks, with particularly strong performance on the GEOM-Drugs molecular conformer generation benchmark.

πŸ“ Abstract
Equivariant neural networks exploit underlying task symmetries to improve generalization, but strict equivariance constraints can induce complex optimization dynamics that hinder learning. Prior work addresses these limitations by relaxing strict equivariance during training, but typically relies on prespecified, explicit or implicit, target levels of relaxation for each network layer, which are task-dependent and costly to tune. We propose Recurrent Equivariant Constraint Modulation (RECM), a layer-wise constraint modulation mechanism that learns appropriate relaxation levels solely from the training signal and the symmetry properties of each layer's input–target distribution, without requiring any prior knowledge of the task-dependent target relaxation level. We show that under the proposed RECM update, the relaxation level of each layer provably converges to a value upper-bounded by its symmetry gap, namely the degree to which its input–target distribution deviates from exact symmetry. Consequently, layers processing symmetric distributions recover full equivariance, while those with approximate symmetries retain sufficient flexibility to learn non-symmetric solutions when warranted by the data. Empirically, RECM outperforms prior methods across diverse exact and approximate equivariance tasks, including the challenging molecular conformer generation task on the GEOM-Drugs dataset.
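To make the layer-wise relaxation idea concrete, below is a minimal sketch of a linear layer whose weight is interpolated between its group-averaged (exactly equivariant) projection and the unconstrained weight, controlled by a per-layer relaxation level `alpha`. This is an illustrative assumption, not the paper's actual parametrization: the group here is the cyclic group C4 acting on length-4 vectors by shifts, and `alpha` is a fixed knob, whereas RECM would modulate it from the training signal. The names `project_equivariant` and `RelaxedLinear` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # cyclic group C4 acting on length-4 vectors by index shifts

# Permutation index arrays for each group element g_k: p_k[i] = (i + k) % n
perms = [(np.arange(n) + k) % n for k in range(n)]

def project_equivariant(W):
    """Group-average W onto the C4-equivariant subspace: mean over g of P_g W P_g^T."""
    return np.mean([W[np.ix_(p, p)] for p in perms], axis=0)

class RelaxedLinear:
    """Linear layer with effective weight W_eff = W_sym + alpha * (W - W_sym).

    alpha in [0, 1] is the per-layer relaxation level: alpha = 0 gives exact
    equivariance, alpha = 1 recovers the unconstrained weight. In RECM this
    level would be learned from the training signal; here it is fixed.
    """
    def __init__(self, W, alpha):
        self.W, self.alpha = W, alpha

    def __call__(self, x):
        W_sym = project_equivariant(self.W)
        W_eff = W_sym + self.alpha * (self.W - W_sym)
        return W_eff @ x

W = rng.standard_normal((n, n))
x = rng.standard_normal(n)
shift = lambda v: np.roll(v, -1)  # the C4 group action on vectors

eq_layer = RelaxedLinear(W, alpha=0.0)    # fully constrained layer
print(np.allclose(eq_layer(shift(x)), shift(eq_layer(x))))  # True: exactly equivariant

free_layer = RelaxedLinear(W, alpha=1.0)  # fully relaxed layer
print(np.allclose(free_layer(x), W @ x))  # True: raw weight recovered
```

The convergence result in the abstract would correspond, in this toy parametrization, to `alpha` settling at a value no larger than how far the layer's input-target distribution deviates from C4 symmetry, so layers facing exactly symmetric data drive `alpha` to zero.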
Problem

Research questions and friction points this paper is trying to address.

equivariant neural networks
symmetry relaxation
constraint modulation
optimization dynamics
layer-wise relaxation
Innovation

Methods, ideas, or system contributions that make the work stand out.

equivariant neural networks
symmetry relaxation
constraint modulation
symmetry gap
recurrent modulation
πŸ”Ž Similar Papers
No similar papers found.