A Neural-Operator Preconditioned Newton Method for Accelerated Nonlinear Solvers

📅 2025-11-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address stagnation or divergence of Newton’s method in solving parametric nonlinear systems caused by severe nonlinearity-induced imbalance, this paper proposes a neural operator-preconditioned Newton method. Our approach integrates deep learning priors into classical numerical optimization: (1) we design a fixed-point neural operator (FPNO) that learns, in an end-to-end differentiable manner, the mapping from the current iterate to the true solution—serving as a learned preconditioner; and (2) we introduce an adaptive negative-step mechanism that relaxes reliance on local convexity assumptions inherent in conventional line search and trust-region methods. The method significantly enhances convergence robustness and computational efficiency for strongly nonlinear, multiscale parametric problems. Experiments across multiple real-world physical modeling tasks demonstrate 2–5× speedup over classical solvers, with markedly reduced sensitivity to initial guesses.

Technology Category

Application Category

📝 Abstract
We propose a novel neural preconditioned Newton (NP-Newton) method for solving parametric nonlinear systems of equations. To overcome the stagnation or instability of Newton iterations caused by unbalanced nonlinearities, we introduce a fixed-point neural operator (FPNO) that learns the direct mapping from the current iterate to the solution by emulating fixed-point iterations. Unlike traditional line-search or trust-region algorithms, the proposed FPNO adaptively employs negative step sizes to effectively mitigate the effects of unbalanced nonlinearities. Through numerical experiments we demonstrate the computational efficiency and robustness of the proposed NP-Newton method across multiple real-world applications, especially for very strong nonlinearities.
Problem

Research questions and friction points this paper is trying to address.

Solving parametric nonlinear systems with unbalanced nonlinearities
Overcoming Newton iteration stagnation using neural operators
Adaptively employing negative step sizes for strong nonlinearities
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural preconditioned Newton method for nonlinear systems
Fixed-point neural operator learns direct solution mapping
Adaptive negative step sizes mitigate unbalanced nonlinearities