Group Averaging for Physics Applications: Accuracy Improvements at Zero Training Cost

📅 2025-11-11
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
In scientific machine learning, models often neglect physical symmetries due to training difficulty or implementation complexity, yielding predictions that violate fundamental physical laws. To address this, we propose Test-time Group Averaging (TGA), a training-free inference-stage correction that requires no architectural modification or retraining; its only overhead is a number of forward passes proportional to the size of the symmetry group. During inference, TGA applies every symmetry transformation to the input, performs a forward pass for each, and averages the outputs, thereby exactly enforcing the target symmetry in the predictions. We provide theoretical guarantees that TGA improves accuracy under common regularity conditions. Experiments across multiple PDE modeling paradigms demonstrate up to a 37% reduction in VRMSE, consistent decreases in evaluation loss, improved adherence to physical constraints, and markedly better visual quality of continuous dynamical trajectories. The core contribution is a training-free, general-purpose technique for exact symmetry enforcement on any pre-trained model.

๐Ÿ“ Abstract
Many machine learning tasks in the natural sciences are precisely equivariant to particular symmetries. Nonetheless, equivariant methods are often not employed, perhaps because training is perceived to be challenging, or the symmetry is expected to be learned, or equivariant implementations are seen as hard to build. Group averaging is an available technique for these situations. It happens at test time; it can make any trained model precisely equivariant at an (often small) cost proportional to the size of the group; it places no requirements on model structure or training. It is known that, under mild conditions, the group-averaged model will have provably better prediction accuracy than the original model. Here we show that inexpensive group averaging can improve accuracy in practice. We take well-established benchmark machine learning models of differential equations in which certain symmetries ought to be obeyed. At evaluation time, we average the models over a small group of symmetries. Our experiments show that this procedure always decreases the average evaluation loss, with improvements of up to 37% in terms of the VRMSE. The averaging produces visually better predictions for continuous dynamics. This short paper shows that, under certain common circumstances, there are no disadvantages to imposing exact symmetries; the ML4PS community should consider group averaging as a cheap and simple way to improve model accuracy.
Problem

Research questions and friction points this paper is trying to address.

Enforcing exact symmetries in scientific ML models
Improving prediction accuracy without retraining costs
Applying group averaging to differential equation benchmarks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Group averaging enforces exact symmetries at test time
Averaging over symmetry groups improves model accuracy
Method requires no changes to model structure or training
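The procedure described above can be sketched in a few lines. For a finite group G and a trained predictor f, the group-averaged predictor is (1/|G|) Σ_g g⁻¹ f(g x), which is exactly equivariant by construction, whatever f is. The snippet below is a minimal illustration, not the paper's code: the C4 rotation group acting on a 2D field and the deliberately non-equivariant toy model are assumptions chosen for the example.

```python
import numpy as np

def group_average(model, x, transforms, inverses):
    """Average g^-1(model(g(x))) over all group elements g.

    The result is exactly equivariant to the group, with no
    retraining and no change to the model itself.
    """
    preds = [inv(model(t(x))) for t, inv in zip(transforms, inverses)]
    return np.mean(preds, axis=0)

# The 4-element rotation group C4 acting on a 2D field via np.rot90.
def rot(k):
    return lambda a: np.rot90(a, k)

transforms = [rot(k) for k in range(4)]   # g: rotate by 90k degrees
inverses = [rot(-k) for k in range(4)]    # g^-1: rotate back

# A toy, deliberately non-equivariant "model": adds a column-dependent bias.
def model(a):
    return a + 0.1 * np.arange(a.shape[1])

x = np.random.default_rng(0).standard_normal((8, 8))
y = group_average(model, x, transforms, inverses)

# Exact equivariance: rotating the input rotates the averaged output.
assert np.allclose(group_average(model, np.rot90(x), transforms, inverses),
                   np.rot90(y))
```

The raw `model` violates the symmetry (rotating its input does not rotate its output), while the averaged predictor satisfies it exactly; the only cost is |G| forward passes per prediction.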
Valentino F. Foit
Center for Cosmology and Particle Physics, Department of Physics, New York University, New York, NY 10003
David W. Hogg
Center for Cosmology and Particle Physics, Department of Physics, New York University, New York, NY 10003
Soledad Villar
Johns Hopkins University
mathematics of data · geometric deep learning · computational harmonic analysis