🤖 AI Summary
High-fidelity partial differential equation (PDE) simulations remain computationally prohibitive for many-query, real-time-control, and design-optimization tasks; data-driven surrogate models offer speed but suffer from poor generalizability and limited reliability. This paper introduces Neural Operator Warm Starts (NOWS), a method that uses a learned solution operator to supply high-quality initial guesses to classical Krylov subspace iterative solvers such as conjugate gradient and GMRES. NOWS preserves the original solver's architecture and is agnostic to the discretization scheme, supporting finite differences, finite elements, and isogeometric analysis, while rigorously maintaining numerical stability and convergence guarantees. Evaluated on multiple benchmark PDE problems, NOWS substantially reduces iteration counts and achieves up to 90% end-to-end runtime reduction. It bridges the trade-off between speed and reliability by systematically integrating neural operators as warm-starting components within traditional iterative solvers rather than as solver replacements.
📝 Abstract
Partial differential equations (PDEs) underpin quantitative descriptions across the physical sciences and engineering, yet high-fidelity simulation remains a major computational bottleneck for many-query, real-time, and design tasks. Data-driven surrogates can be strikingly fast but are often unreliable when applied outside their training distribution. Here we introduce Neural Operator Warm Starts (NOWS), a hybrid strategy that harnesses learned solution operators to accelerate classical iterative solvers by producing high-quality initial guesses for Krylov methods such as conjugate gradient and GMRES. NOWS leaves existing discretizations and solver infrastructures intact, integrating seamlessly with finite-difference, finite-element, finite-volume, and isogeometric-analysis discretizations. Across our benchmarks, the learned initialization consistently reduces iteration counts and end-to-end runtime, cutting computational time by up to 90% while preserving the stability and convergence guarantees of the underlying numerical algorithms. By combining the rapid inference of neural operators with the rigor of traditional solvers, NOWS provides a practical and trustworthy approach to accelerating high-fidelity PDE simulations.