AMStraMGRAM: Adaptive Multi-cutoff Strategy Modification for ANaGRAM

📅 2025-10-14
🤖 AI Summary
Physics-informed neural networks (PINNs) can converge unstably under natural gradient optimization because of ill-conditioned spectra. Method: Building on the ANaGRAM framework, this work analyzes PINN training dynamics, showing that cutoff (truncation) regularization suppresses low signal-to-noise-ratio modes in the frequency domain, and develops a spectral-theoretic argument, connected to Green's function analysis, for why regularization is necessary. Based on these insights, the authors propose a multi-cutoff adaptive strategy that dynamically aligns regularization strength with the frequency-specific optimization requirements of different solution components. Results: Experiments across multiple PDE benchmarks reach machine-precision convergence, significantly outperforming standard optimizers such as Adam and L-BFGS while improving stability and robustness.

📝 Abstract
Recent works have shown that natural gradient methods can significantly outperform standard optimizers when training physics-informed neural networks (PINNs). In this paper, we analyze the training dynamics of PINNs optimized with ANaGRAM, a natural-gradient-inspired approach employing singular value decomposition with cutoff regularization. Building on this analysis, we propose a multi-cutoff adaptation strategy that further enhances ANaGRAM's performance. Experiments on benchmark PDEs validate the effectiveness of our method, which reaches machine precision on some experiments. To provide theoretical grounding, we develop a framework based on spectral theory that explains the necessity of regularization, and extend previously established connections with Green's function theory.
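The core mechanism described in the abstract — a cutoff-regularized pseudoinverse applied in a natural-gradient-style update — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Jacobian `J`, residual `r`, and the `cutoff` threshold are placeholder assumptions standing in for ANaGRAM's actual operators.

```python
import numpy as np

def cutoff_pseudoinverse_step(J, r, cutoff=1e-6):
    """Natural-gradient-style update direction via truncated SVD.

    J : (m, p) Jacobian of residuals w.r.t. parameters.
    r : (m,)  residual vector.
    Singular values below `cutoff` relative to the largest are
    discarded, suppressing ill-conditioned, low-SNR modes.
    """
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    keep = s > cutoff * s[0]  # the "cutoff" regularization: drop tiny modes
    # Pseudoinverse restricted to the retained spectral modes
    return Vt[keep].T @ ((U[:, keep].T @ r) / s[keep])

# Usage on a deliberately ill-conditioned system: the truncated
# solve stays bounded where a raw inverse would blow up.
rng = np.random.default_rng(0)
J = rng.standard_normal((20, 5)) @ np.diag([1.0, 0.5, 0.1, 1e-8, 1e-10])
r = rng.standard_normal(20)
step = cutoff_pseudoinverse_step(J, r, cutoff=1e-6)
print(step.shape)  # (5,)
```

A multi-cutoff adaptation, as the paper proposes, would vary `cutoff` over training rather than fixing it, so that different frequency components of the solution are regularized according to their current signal-to-noise ratio.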
Problem

Research questions and friction points this paper is trying to address.

Enhancing PINN training with adaptive natural gradient regularization
Improving cutoff strategy for neural network optimization dynamics
Developing spectral theory framework for regularization justification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-cutoff adaptation strategy enhances ANaGRAM performance
Spectral theory framework explains regularization necessity
Natural gradient method with cutoff regularization improves training
Nilo Schwencke
LISN, Université Paris-Saclay, INRIA Saclay
Cyriaque Rousselot
LISN, Université Paris-Saclay, INRIA Saclay
Alena Shilova
Research Scientist, Inria Saclay
deep learning, efficient training, reinforcement learning
Cyril Furtlehner
Inria
statistical physics, machine learning, complex systems, traffic forecasting