🤖 AI Summary
Neural operator learning often hits an accuracy bottleneck, failing to surpass traditional reduced-order models. To address this, we propose CHONKNORIS, a method that integrates Newton–Kantorovich iteration with operator learning: rather than regressing the solution operator directly, it regresses the Cholesky factor of the elliptic operator arising in Tikhonov-regularized Newton–Kantorovich updates, yielding an unrolled neural architecture whose contractive iterations reach machine-precision solutions for both forward and inverse nonlinear PDEs. Our approach unifies Cholesky factor regression and differentiable iterative unrolling, trained end-to-end. We further generalize it into FONKNORIS, a mixture-of-experts framework that aggregates pre-trained CHONKNORIS experts and generalizes to unseen PDEs (e.g., the Klein–Gordon and Sine–Gordon equations). Evaluated on six challenging benchmarks, including nonlinear elliptic equations, Burgers' equation, Darcy flow, the Calderón problem, wave scattering, and seismic imaging, CHONKNORIS achieves machine precision across all tasks.
📝 Abstract
Neural operator learning methods have garnered significant attention in scientific computing for their ability to approximate infinite-dimensional operators. However, increasing their complexity often fails to substantially improve their accuracy, leaving them on par with much simpler approaches such as kernel methods and more traditional reduced-order models. In this article, we set out to address this shortcoming and introduce CHONKNORIS (Cholesky Newton–Kantorovich Neural Operator Residual Iterative System), an operator learning paradigm that can achieve machine precision. CHONKNORIS draws on numerical analysis: many nonlinear forward and inverse PDE problems are solvable by Newton-type methods. Rather than regressing the solution operator itself, our method regresses the Cholesky factors of the elliptic operator associated with Tikhonov-regularized Newton–Kantorovich updates. The resulting unrolled iteration yields a neural architecture whose machine-precision behavior follows from achieving a contractive map, requiring far lower accuracy than end-to-end approximation of the solution operator. We benchmark CHONKNORIS on a range of nonlinear forward and inverse problems, including a nonlinear elliptic equation, Burgers' equation, a nonlinear Darcy flow problem, the Calderón problem, an inverse wave scattering problem, and a problem from seismic imaging. We also present theoretical guarantees for the convergence of CHONKNORIS in terms of the accuracy of the emulated Cholesky factors. Additionally, we introduce a foundation model variant, FONKNORIS (Foundation Newton–Kantorovich Neural Operator Residual Iterative System), which aggregates multiple pre-trained CHONKNORIS experts for diverse PDEs to emulate the solution map of a novel nonlinear PDE. Our FONKNORIS model is able to accurately solve unseen nonlinear PDEs such as the Klein–Gordon and Sine–Gordon equations.
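To make the iteration concrete, here is a minimal numerical sketch of the Tikhonov-regularized Newton–Kantorovich update the abstract describes, applied to a 1D nonlinear elliptic toy problem, −u″ + u³ = f with zero Dirichlet boundary conditions. In CHONKNORIS the Cholesky factor of the regularized elliptic operator is regressed by a neural network; here, for illustration only, it is computed exactly with a direct factorization. The specific PDE, discretization, and regularization parameter are assumptions for this sketch, not details from the paper.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Toy problem (assumed for illustration): -u'' + u^3 = f on (0, 1),
# u(0) = u(1) = 0, with f chosen so the exact solution is sin(pi x).
n = 100
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)[1:-1]          # interior grid nodes
f = np.pi**2 * np.sin(np.pi * x) + np.sin(np.pi * x) ** 3

# Dense second-difference matrix (small problem, clarity over efficiency).
L = (np.diag(2.0 * np.ones(n - 1))
     - np.diag(np.ones(n - 2), 1)
     - np.diag(np.ones(n - 2), -1)) / h**2

def F(u):
    """Discrete residual of the nonlinear PDE."""
    return L @ u + u**3 - f

u = np.zeros(n - 1)          # initial guess
alpha = 1e-10                # Tikhonov regularization weight (assumed value)
for _ in range(20):
    J = L + np.diag(3.0 * u**2)               # Frechet derivative of F at u
    A = J.T @ J + alpha * np.eye(n - 1)       # regularized normal equations (SPD)
    c, low = cho_factor(A)                    # Cholesky factor: the object
                                              # CHONKNORIS regresses with a network
    u = u + cho_solve((c, low), -J.T @ F(u))  # Newton-Kantorovich update
    if np.linalg.norm(F(u)) < 1e-8:           # residual near machine precision
        break
```

Because each update is a contraction near the solution, the residual collapses in a handful of iterations; this is the mechanism behind the paper's claim that emulating the Cholesky factor to modest accuracy suffices, since the iteration, not the network, delivers the final precision.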