🤖 AI Summary
To address the slow convergence and poor generalization of iterative solvers for parametric partial differential equations (PDEs) on arbitrary unstructured meshes, this paper proposes a geometry-aware hybrid preconditioning framework. The method integrates finite-element mesh encoding with a novel Geo-DeepONet architecture—yielding the first neural preconditioner capable of cross-geometry generalization without retraining. Coupled with Krylov subspace methods (e.g., GMRES) and multilevel relaxation strategies, it enables geometry-adaptive iterative acceleration. Evaluated on parametric PDEs from elasticity and heat conduction, the framework reduces generalization error on unseen geometries by over 60%, accelerates overall solution time by 3–5×, and significantly improves robustness and transferability across diverse geometric domains.
📝 Abstract
The convergence behavior of classical iterative solvers for parametric partial differential equations (PDEs) is often highly sensitive to the domain and specific discretization of PDEs. Previously, we introduced hybrid solvers by combining the classical solvers with neural operators for a specific geometry 1, but they tend to under-perform in geometries not encountered during training. To address this challenge, we introduce Geo-DeepONet, a geometry-aware deep operator network that incorporates domain information extracted from finite element discretizations. Geo-DeepONet enables accurate operator learning across arbitrary unstructured meshes without requiring retraining. Building on this, we develop a class of geometry-aware hybrid preconditioned iterative solvers by coupling Geo-DeepONet with traditional methods such as relaxation schemes and Krylov subspace algorithms. Through numerical experiments on parametric PDEs posed over diverse unstructured domains, we demonstrate the enhanced robustness and efficiency of the proposed hybrid solvers for multiple real-world applications.