Accelerating Natural Gradient Descent for PINNs with Randomized Numerical Linear Algebra

📅 2025-05-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses two key challenges in training physics-informed neural networks (PINNs) with natural gradient descent (NGD): the prohibitive computational cost of NGD and the slow convergence of conjugate gradient (CG) methods caused by ill-conditioning of the Gramian matrix. We propose a novel matrix-free NGD framework leveraging randomized Nyström preconditioning. To our knowledge, this is the first application of randomized Nyström low-rank approximation to NGD optimization for PINNs—avoiding explicit construction or storage of the Gramian while effectively mitigating its ill-conditioning. By integrating matrix-free Hessian-vector products with randomized numerical linear algebra techniques, our method achieves significantly faster convergence and reduced training time across diverse partial differential equation (PDE) benchmarks compared to existing NGD approaches. Moreover, it improves solution accuracy and stability, thereby enhancing the practical applicability of NGD in complex PDE modeling tasks.
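The core building block named in the summary—a randomized Nyström low-rank approximation built from matrix-vector products alone—can be sketched in a few lines of NumPy. This is a minimal illustration of the standard shifted-Cholesky variant, not the paper's implementation; the function name, sketch size, and the (conservatively scaled) stability shift are all assumptions for the sake of the example.

```python
import numpy as np

def nystrom_approx(matvec, n, sketch_size, rng):
    """Randomized Nystrom approximation A ~= U diag(lam) U^T of a PSD operator.

    Only matrix-vector products with A are required, so the Gramian is
    never constructed or stored explicitly.
    """
    Omega = rng.standard_normal((n, sketch_size))
    Omega, _ = np.linalg.qr(Omega)                 # orthonormal test matrix
    Y = np.column_stack([matvec(Omega[:, j]) for j in range(sketch_size)])
    nu = 1e-8 * np.linalg.norm(Y)                  # stability shift (conservative choice here)
    Y_nu = Y + nu * Omega
    L = np.linalg.cholesky(Omega.T @ Y_nu)         # core matrix is PSD after the shift
    B = np.linalg.solve(L, Y_nu.T).T               # B = Y_nu L^{-T}
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    lam = np.maximum(s**2 - nu, 0.0)               # remove the shift from the spectrum
    return U, lam
```

On an operator whose numerical rank is below the sketch size, the approximation is essentially exact; more generally its error tracks the trailing eigenvalues of the operator.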

📝 Abstract
Natural Gradient Descent (NGD) has emerged as a promising optimization algorithm for training neural network-based solvers for partial differential equations (PDEs), such as Physics-Informed Neural Networks (PINNs). However, its practical use is often limited by the high computational cost of solving linear systems involving the Gramian matrix. While matrix-free NGD methods based on the conjugate gradient (CG) method avoid explicit matrix inversion, the ill-conditioning of the Gramian significantly slows the convergence of CG. In this work, we extend matrix-free NGD to broader classes of problems than previously considered and propose the use of Randomized Nyström preconditioning to accelerate convergence of the inner CG solver. The resulting algorithm demonstrates substantial performance improvements over existing NGD-based methods on a range of PDE problems discretized using neural networks.
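The "matrix-free" aspect of the abstract rests on a simple associativity trick: with J the N × p residual Jacobian, the Gramian-vector product G v = (1/N) Jᵀ(J v) never requires forming G = (1/N) JᵀJ. A minimal sketch, with the 1/N scaling and least-squares Gramian form assumed as generic conventions rather than taken from the paper:

```python
import numpy as np

def gramian_matvec(J, v):
    """Apply the Gramian G = (1/N) J^T J to v without forming G.

    Associating the products as J^T (J v) costs O(N p) per application,
    versus O(N p^2) to build G explicitly. In an actual PINN, J itself is
    also never materialized: the two products become one Jacobian-vector
    product and one vector-Jacobian product through the network.
    """
    N = J.shape[0]
    return J.T @ (J @ v) / N
```

These matvecs are exactly what an inner CG solver consumes, which is why preconditioning CG—rather than factorizing G—is the natural acceleration target.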
Problem

Research questions and friction points this paper is trying to address.

Reducing the computational cost of Natural Gradient Descent for PINNs
Accelerating convergence of the conjugate gradient method on the ill-conditioned Gramian
Improving the performance of NGD-based methods on PDE problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Randomized Nyström preconditioning accelerates CG
Matrix-free NGD avoids explicit matrix inversion
Extends NGD to broader problem classes
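The three innovations combine in a Nyström-preconditioned CG solve of the damped natural-gradient system (G + μI)x = b. Below is a minimal sketch of preconditioned CG using the standard Nyström preconditioner P⁻¹ = (λ_ℓ + μ) U (Λ + μI)⁻¹ Uᵀ + (I − UUᵀ); the function names and the assumption that approximate eigenpairs (U, lam) are supplied (e.g., by a randomized Nyström routine) are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def nystrom_pcg(matvec, b, U, lam, mu, tol=1e-8, maxiter=500):
    """Preconditioned CG for (A + mu*I) x = b with a Nystrom preconditioner.

    U (n x r, orthonormal columns) and lam (length r, descending) are an
    approximate top-r eigendecomposition of the PSD operator A.
    """
    lam_l = lam[-1]                        # smallest retained eigenvalue

    def prec(r):
        # P^{-1} r: rescale the top-r eigenspace, identity on its complement.
        Ur = U.T @ r
        return (lam_l + mu) * (U @ (Ur / (lam + mu))) + (r - U @ Ur)

    x = np.zeros_like(b)
    r = b.copy()
    z = prec(r)
    p = z.copy()
    rz = r @ z
    for k in range(maxiter):
        Ap = matvec(p) + mu * p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k + 1
        z = prec(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter
```

When the retained eigenpairs capture the dominant spectrum, the preconditioned operator's eigenvalues collapse into a narrow band near λ_ℓ + μ, so CG converges in a handful of iterations even when the raw Gramian is severely ill-conditioned.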