Multi-Preconditioned LBFGS for Training Finite-Basis PINNs

📅 2026-01-13
🤖 AI Summary
This work addresses the slow convergence and high communication overhead commonly encountered when training finite-basis physics-informed neural networks (FBPINNs) by proposing a multi-preconditioned L-BFGS algorithm based on the nonlinear additive Schwarz method. Exploiting the intrinsic domain-decomposition structure of FBPINNs, the approach constructs local quasi-Newton corrections in parallel across subdomains and optimally combines these updates through a low-dimensional subspace minimization problem. The resulting nonlinear multi-preconditioning mechanism effectively balances convergence rate, solution accuracy, and communication efficiency. Experimental results demonstrate that, compared to standard L-BFGS, the proposed method significantly accelerates convergence, improves model accuracy, and substantially reduces communication costs.

📝 Abstract
A multi-preconditioned LBFGS (MP-LBFGS) algorithm is introduced for training finite-basis physics-informed neural networks (FBPINNs). The algorithm is motivated by the nonlinear additive Schwarz method and exploits the domain-decomposition-inspired additive architecture of FBPINNs, in which local neural networks are defined on subdomains, thereby localizing the network representation. Parallel, subdomain-local quasi-Newton corrections are then constructed on the corresponding local parts of the architecture. A key feature is a novel nonlinear multi-preconditioning mechanism, in which subdomain corrections are optimally combined through the solution of a low-dimensional subspace minimization problem. Numerical experiments indicate that MP-LBFGS can improve both convergence speed and model accuracy over standard LBFGS while incurring lower communication overhead.
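The core mechanism described above can be illustrated with a minimal sketch. The example below is not the paper's implementation: it uses a toy quadratic loss in place of a PINN residual, disjoint coordinate masks as stand-in "subdomains", and masked negative-gradient directions as stand-ins for the subdomain-local L-BFGS corrections. What it does show faithfully is the multi-preconditioning step: the k local corrections are stacked into a matrix D, and the combination weights alpha are obtained by solving a small k-dimensional subspace minimization (exact for a quadratic loss).

```python
# Hedged sketch of nonlinear multi-preconditioning with subspace minimization.
# Assumptions (not from the paper): a toy SPD quadratic loss stands in for the
# FBPINN training loss, disjoint coordinate masks stand in for subdomain
# parameter sets, and masked gradients stand in for local L-BFGS corrections.
import numpy as np

rng = np.random.default_rng(0)
n, k = 12, 3                        # number of parameters, number of subdomains

A0 = rng.standard_normal((n, n))
A = A0 @ A0.T + n * np.eye(n)       # SPD Hessian of the toy quadratic loss
b = rng.standard_normal(n)
loss = lambda th: 0.5 * th @ A @ th - b @ th
grad = lambda th: A @ th - b

# Hypothetical subdomain masks over the parameter vector; in FBPINNs each
# mask would instead select the parameters of one local network.
masks = [np.arange(n) % k == i for i in range(k)]

theta = rng.standard_normal(n)
for _ in range(50):
    g = grad(theta)
    # "Parallel" local corrections: one direction per subdomain, supported
    # only on that subdomain's parameters (stand-in for local quasi-Newton).
    D = np.stack([-np.where(m, g, 0.0) for m in masks], axis=1)   # n x k
    # Low-dimensional subspace minimization: choose alpha so that
    # theta + D @ alpha minimizes the loss over span(d_1, ..., d_k).
    # For a quadratic loss this reduces to a k x k linear system.
    alpha = np.linalg.lstsq(D.T @ A @ D, D.T @ (b - A @ theta), rcond=None)[0]
    theta = theta + D @ alpha

print(f"final gradient norm: {np.linalg.norm(grad(theta)):.2e}")
```

Because the combined step is optimal over the whole k-dimensional subspace (which contains the full negative gradient as the sum of the columns of D), each iteration reduces the loss at least as much as exact-line-search steepest descent; only the small k-vector alpha needs to be communicated between subdomains, which is the source of the communication savings the abstract refers to.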
Problem

Research questions and friction points this paper is trying to address.

physics-informed neural networks
finite-basis
domain decomposition
convergence acceleration
communication overhead
Innovation

Methods, ideas, or system contributions that make the work stand out.

multi-preconditioning
LBFGS
physics-informed neural networks
domain decomposition
quasi-Newton