π€ AI Summary
Physical-informed neural networks (PINNs) suffer from severe ill-conditioning of the loss function and poor convergence when solving partial differential equations (PDEs) with sharp interfaces, primarily due to the numerical instability induced by differential operators. To address this, this paper proposes a homotopy continuation-based optimization framework. We introduce the NysNewton-CG second-order optimizerβa novel design that theoretically establishes, for the first time, the intrinsic relationship between the ill-conditioning of differential operators and the curvature of the PINN loss landscape. Additionally, we propose an Adam-L-BFGS hybrid strategy (AL), integrating residual-adaptive modeling with loss-landscape analysis. Experiments demonstrate that our method significantly improves convergence speed and solution accuracy for strongly nonlinear and high-order PDEs, effectively overcoming the failure of conventional first-order optimizers in sharp-interface scenarios.
π Abstract
This paper explores challenges in training Physics-Informed Neural Networks (PINNs), emphasizing the role of the loss landscape in the training process. We examine difficulties in minimizing the PINN loss function, particularly due to ill-conditioning caused by differential operators in the residual term. We compare gradient-based optimizers Adam, L-BFGS, and their combination al{}, showing the superiority of al{}, and introduce a novel second-order optimizer, NysNewton-CG (NNCG), which significantly improves PINN performance. Theoretically, our work elucidates the connection between ill-conditioned differential operators and ill-conditioning in the PINN loss and shows the benefits of combining first- and second-order optimization methods. Our work presents valuable insights and more powerful optimization strategies for training PINNs, which could improve the utility of PINNs for solving difficult partial differential equations.