OpInf-LLM: Parametric PDE Solving with LLMs via Operator Inference

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current large language models (LLMs) often exhibit limited generalization and struggle to simultaneously achieve high accuracy and success rates when solving partial differential equations (PDEs) with unseen parameters or boundary conditions. This work proposes OpInf-LLM, a novel framework that uniquely integrates operator inference with LLMs to construct reduced-order models from minimal solution data and enables PDE-solving tasks to be specified via natural language. Evaluated across diverse PDE settings, the method achieves substantially higher execution success rates and numerical accuracy compared to existing LLM-based approaches. OpInf-LLM offers low computational overhead, strong generalization capabilities, and a unified interface, establishing an efficient new paradigm for solving parametric PDEs.

📝 Abstract
Solving diverse partial differential equations (PDEs) is fundamental in science and engineering. Large language models (LLMs) have demonstrated strong capabilities in code generation, symbolic reasoning, and tool use, but reliably solving PDEs across heterogeneous settings remains challenging. Prior work on LLM-based code generation and transformer-based foundation models for PDE learning has shown promising advances. However, a persistent trade-off between execution success rate and numerical accuracy arises, particularly when generalization to unseen parameters and boundary conditions is required. In this work, we propose OpInf-LLM, an operator-inference-based framework for solving parametric PDEs with LLMs. The proposed framework leverages a small amount of solution data to enable accurate prediction of diverse PDE instances, including unseen parameters and configurations, and integrates seamlessly with LLMs for natural language specification of PDE-solving tasks. Its low computational demands and unified tool interface further enable a high execution success rate across heterogeneous settings. By combining operator inference with LLM capabilities, OpInf-LLM opens new possibilities for generalizable reduced-order modeling in LLM-based PDE solving.
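The paper does not include implementation details, but the core ingredient it names, operator inference, has a standard form: project solution snapshots onto a low-dimensional POD basis and fit reduced operators by least squares. The sketch below is a minimal, hypothetical illustration of that idea for a linear semi-discrete system dx/dt = A x; the function name and interface are our own, not the paper's.

```python
import numpy as np

def operator_inference(snapshots, r, dt):
    """Learn a reduced linear operator from solution snapshots.

    snapshots: (n, k) array whose columns are states x(t_j) at uniform
               time step dt
    r:         reduced dimension
    Returns the (n, r) POD basis V and the (r, r) fitted operator A_hat,
    so that the full-order dynamics are approximated by V @ A_hat @ V.T.
    """
    # POD basis: leading r left singular vectors of the snapshot matrix
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    V = U[:, :r]

    # Project snapshots into the reduced space: (r, k)
    Xr = V.T @ snapshots
    # Estimate reduced time derivatives by forward finite differences
    dXr = (Xr[:, 1:] - Xr[:, :-1]) / dt

    # Least-squares fit of the reduced model: dXr ≈ A_hat @ Xr[:, :-1]
    sol, *_ = np.linalg.lstsq(Xr[:, :-1].T, dXr.T, rcond=None)
    return V, sol.T
```

In a parametric setting one would repeat this fit across sampled parameter values and interpolate the learned operators; the LLM's role in the paper is to translate a natural-language task description into calls to such a tool.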
Problem

Research questions and friction points this paper is trying to address.

PDE solving
large language models
operator inference
generalization
numerical accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Operator Inference
Large Language Models
Parametric PDE Solving
Reduced-Order Modeling
Natural Language Interface