Graph Neural Preconditioners for Iterative Solutions of Sparse Linear Systems

📅 2024-06-02
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Traditional algebraic preconditioners (e.g., ILU, AMG) can fail on ill-conditioned, large-scale sparse linear systems, exhibit unpredictable and expensive setup costs, and often rely on problem-specific physical priors. Method: The paper proposes an end-to-end differentiable, general-purpose preconditioner based on graph neural networks (GNNs). It encodes sparse matrices as graphs without requiring underlying physical knowledge, enabling strong generalization across diverse problem domains. Contribution/Results: The GNN-based preconditioner achieves more predictable and significantly shorter setup times than ILU and AMG. Integrated with Krylov subspace methods (e.g., GMRES), it reduces iteration counts relative to inner-outer GMRES. Evaluated on 800+ real-world matrices spanning PDEs, economics, statistics, and graph learning, the approach consistently improves both solver efficiency and robustness, demonstrating scalability, generality, and practical applicability for large-scale sparse linear systems.

📝 Abstract
Preconditioning is at the heart of iterative solutions of large, sparse linear systems of equations in scientific disciplines. Several algebraic approaches, which access no information beyond the matrix itself, are widely studied and used, but ill-conditioned matrices remain very challenging. We take a machine learning approach and propose using graph neural networks as a general-purpose preconditioner. They show attractive performance for many problems and can be used when the mainstream preconditioners perform poorly. Empirical evaluation on over 800 matrices suggests that the construction time of these graph neural preconditioners (GNPs) is more predictable and can be much shorter than that of other widely used ones, such as ILU and AMG, while the execution time is faster than using a Krylov method as the preconditioner, such as in inner-outer GMRES. GNPs have a strong potential for solving large-scale, challenging algebraic problems arising from not only partial differential equations, but also economics, statistics, graph, and optimization, to name a few.
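To make the abstract's setup concrete, the sketch below shows where a graph neural preconditioner would sit inside a Krylov solve: the learned model plays the role of an approximate inverse M⁻¹, applied to each residual via a `LinearOperator` passed to SciPy's GMRES. The GNN forward pass itself is not reproduced here; a simple Jacobi (diagonal) sweep stands in for the trained network, and the matrix, sizes, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres

# Small sparse test system: a 1-D Laplacian (illustrative stand-in
# for the paper's large, ill-conditioned matrices).
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Stand-in for the trained graph neural preconditioner: the GNP maps a
# residual vector to an approximate error correction, i.e. it acts as
# M^{-1}. Here a Jacobi sweep substitutes for the learned GNN forward pass.
diag_inv = 1.0 / A.diagonal()

def gnp_apply(r):
    # A real GNP would run a GNN over the matrix graph here.
    return diag_inv * r

M = LinearOperator((n, n), matvec=gnp_apply)

# Preconditioned GMRES: the solver calls gnp_apply once per iteration.
x, info = gmres(A, b, M=M, restart=n, maxiter=5)
```

`info == 0` signals convergence; the point of the sketch is only the plumbing, since a diagonal sweep gives no real speedup on this matrix, whereas the paper's learned operator is trained to approximate A⁻¹ on hard problems.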
Problem

Research questions and friction points this paper is trying to address.

Sparse Linear Systems
Iterative Methods
Preconditioning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph Neural Networks
Sparse Linear Systems
Iterative Methods Acceleration
Jie Chen
MIT-IBM Watson AI Lab, IBM Research