🤖 AI Summary
Physics-informed neural networks (PINNs) suffer from unstable convergence under natural gradient optimization due to ill-conditioned spectra. Method: Building on the ANaGRAM framework, this work systematically analyzes PINN training dynamics, revealing that cutoff regularization critically suppresses low signal-to-noise-ratio modes in the frequency domain; it further develops a theoretical framework, grounded in spectral theory and Green's function analysis, that explains why regularization is necessary. Based on these insights, the authors propose a multi-cutoff adaptive strategy that dynamically aligns the regularization strength with the frequency-specific optimization requirements of different solution components. Results: Experiments across multiple PDE benchmarks demonstrate machine-precision convergence, significantly outperforming standard optimizers such as Adam and L-BFGS, while achieving superior stability and generalization robustness.
📝 Abstract
Recent works have shown that natural gradient methods can significantly outperform standard optimizers when training physics-informed neural networks (PINNs). In this paper, we analyze the training dynamics of PINNs optimized with ANaGRAM, a natural-gradient-inspired approach employing singular value decomposition with cutoff regularization. Building on this analysis, we propose a multi-cutoff adaptation strategy that further enhances ANaGRAM's performance. Experiments on benchmark PDEs validate the effectiveness of our method, which reaches machine precision in some experiments. To provide theoretical grounding, we develop a framework based on spectral theory that explains the necessity of regularization, and we extend previously established connections with Green's function theory.
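The core operation the abstract refers to, a cutoff-regularized pseudoinverse built from an SVD, can be sketched as follows. This is a minimal NumPy illustration, not the authors' ANaGRAM implementation: the Jacobian `J`, residual `r`, and the relative cutoff threshold are hypothetical placeholders, and the function merely shows how singular values below a cutoff are discarded to suppress ill-conditioned modes before computing an update direction.

```python
import numpy as np

def cutoff_svd_step(J, r, cutoff=1e-6):
    """Update direction via a cutoff-regularized pseudoinverse.

    Singular values below cutoff * s_max are discarded, which suppresses
    the ill-conditioned (low signal-to-noise) modes of the system J @ delta = r.
    """
    U, s, Vt = np.linalg.svd(J, full_matrices=False)  # s sorted descending
    keep = s > cutoff * s[0]                          # relative cutoff
    # delta = V_k @ diag(1/s_k) @ U_k^T @ r  (truncated pseudoinverse)
    return Vt[keep].T @ ((U[:, keep].T @ r) / s[keep])

# Toy example with a deliberately ill-conditioned Jacobian.
rng = np.random.default_rng(0)
J = rng.standard_normal((50, 10)) @ np.diag(10.0 ** -np.arange(10))
r = rng.standard_normal(50)
delta = cutoff_svd_step(J, r, cutoff=1e-4)
```

Varying `cutoff` per step (or using several cutoffs at once, as in the multi-cutoff strategy described above) changes which spectral modes contribute to the update; the sketch uses a single fixed threshold for simplicity.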