PCGBandit: One-shot acceleration of transient PDE solvers via online-learned preconditioners

📅 2025-09-10
🤖 AI Summary
Configuring preconditioners for transient partial differential equation (PDE) solvers remains largely empirical and non-adaptive, hindering robustness and efficiency across varying problem regimes. Method: This paper proposes PCGBandit—a framework that integrates preconditioned conjugate gradient (PCG) solvers with the OpenFOAM platform and employs a multi-armed bandit algorithm to *online* select optimal preconditioners and solver parameters *within a single simulation*, leveraging real-time residual feedback without offline training or historical data. Contribution/Results: Evaluated on multiple fluid dynamics and magnetohydrodynamics benchmarks, PCGBandit significantly reduces iteration counts and wall-clock time compared to conventional fixed-configuration solvers. It delivers one-shot, plug-and-play acceleration—achieving performance gains without requiring labeled datasets or prior training—thereby overcoming a key limitation of traditional data-driven preconditioning approaches.

📝 Abstract
Data-driven acceleration of scientific computing workflows has been a high-profile aim of machine learning (ML) for science, with numerical simulation of transient partial differential equations (PDEs) being one of the main applications. The focus thus far has been on methods that require classical simulations to train, which when combined with the data-hungriness and optimization challenges of neural networks has caused difficulties in demonstrating a convincing advantage against strong classical baselines. We consider an alternative paradigm in which the learner uses a classical solver's own data to accelerate it, enabling a one-shot speedup of the simulation. Concretely, since transient PDEs often require solving a sequence of related linear systems, the feedback from repeated calls to a linear solver such as preconditioned conjugate gradient (PCG) can be used by a bandit algorithm to online-learn an adaptive sequence of solver configurations (e.g. preconditioners). The method we develop, PCGBandit, is implemented directly on top of the popular open source software OpenFOAM, which we use to show its effectiveness on a set of fluid and magnetohydrodynamics (MHD) problems.
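The bandit-over-preconditioners loop described in the abstract can be sketched with a toy example. This is not the paper's implementation: the arm set (identity vs. Jacobi preconditioning), the reward (negative PCG iteration count), and the UCB1 selection rule are illustrative assumptions standing in for whatever configuration space and bandit algorithm PCGBandit actually uses inside OpenFOAM.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, maxiter=1000):
    """Plain preconditioned conjugate gradient; returns (x, iteration count)."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

rng = np.random.default_rng(0)
n = 200
# SPD test system: 1D Laplacian plus a strongly varying diagonal shift,
# so Jacobi (diagonal) preconditioning clearly beats no preconditioning.
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
w = rng.uniform(0.1, 100.0, size=n)
d = np.diag(L) + w

# Bandit "arms": candidate preconditioners (hypothetical, not the paper's set).
arms = [("none", lambda r: r), ("jacobi", lambda r: r / d)]
counts = np.zeros(len(arms))
means = np.zeros(len(arms))

for t in range(1, 41):  # 40 "timesteps" of a mock transient simulation
    A = L + np.diag(w)  # in a real transient solve this matrix drifts slowly
    b = rng.standard_normal(n)
    # UCB1: pull each arm once, then pick max of mean reward + exploration bonus.
    if t <= len(arms):
        arm = t - 1
    else:
        arm = int(np.argmax(means + np.sqrt(2 * np.log(t) / counts)))
    x, iters = pcg(A, b, arms[arm][1])
    reward = -iters  # fewer PCG iterations = higher reward
    counts[arm] += 1
    means[arm] += (reward - means[arm]) / counts[arm]

print("preferred arm:", arms[int(np.argmax(means))][0])
```

Because the only feedback used is the iteration count the solver produces anyway, the learning adds essentially no overhead and needs no offline data, which is the core of the one-shot paradigm.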
Problem

Research questions and friction points this paper is trying to address.

Accelerating transient PDE solvers via online-learned preconditioners
Enabling one-shot speedup of simulations using the solver's own data
Overcoming neural network training challenges in scientific computing workflows
Innovation

Methods, ideas, or system contributions that make the work stand out.

Online-learned adaptive preconditioners via a bandit algorithm
One-shot acceleration using the solver's own data
Integration with OpenFOAM for fluid and MHD problems
Mikhail Khodak
University of Wisconsin-Madison, United States
Min Ki Jung
Seoul National University, South Korea
Brian Wynne
Princeton University, United States
Edmond Chow
Georgia Institute of Technology, United States
scientific computing, high-performance computing, numerical methods
Egemen Kolemen
Princeton University, United States
Plasma Control