Towards a Foundation Model for Partial Differential Equations Across Physics Domains

📅 2025-11-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the lack of a cross-physics foundation model for partial differential equation (PDE) modeling, this paper introduces PDE-FM, the first unified pre-trained foundation model for multi-physics systems. PDE-FM jointly tokenizes spatial and spectral representations, incorporates physics-aware conditioning mechanisms, employs a Mamba-based state-space backbone, and adopts an operator-theoretic decoder to enable coherent reasoning across spatial, spectral, and temporal dimensions. Crucially, it generalizes to unseen physical systems without architectural or data-level adaptation. Evaluated on twelve diverse 2D/3D multi-physics benchmarks, PDE-FM achieves state-of-the-art performance on six of them, reducing mean VRMSE by 46%. These results demonstrate substantial improvements in model reusability and cross-domain transfer capability, establishing a scalable foundation for physics-informed AI.
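The summary names four components: spatial-spectral tokenization, physics-aware conditioning, a Mamba-style state-space backbone, and an operator-theoretic decoder. The paper's actual architecture is not reproduced here, but a minimal numpy sketch of the first and third components, with hypothetical patch sizes and state dimensions, conveys the general shape of the pipeline:

```python
import numpy as np

def spatial_spectral_tokenize(field, patch=8):
    """Illustrative stand-in for spatial-spectral tokenization:
    split a 2D field into patches and pair each spatial patch with
    the magnitudes of its 2D Fourier transform."""
    H, W = field.shape
    tokens = []
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            p = field[i:i + patch, j:j + patch]
            spec = np.abs(np.fft.fft2(p))          # spectral view of the patch
            tokens.append(np.concatenate([p.ravel(), spec.ravel()]))
    return np.stack(tokens)                        # (num_patches, 2 * patch**2)

def ssm_scan(tokens, A, B, C):
    """Minimal linear state-space recurrence, the core scan behind
    Mamba-style backbones: h_t = A h_{t-1} + B x_t, y_t = C h_t.
    (Mamba additionally makes A, B, C input-dependent.)"""
    h = np.zeros(A.shape[0])
    out = []
    for x in tokens:
        h = A @ h + B @ x
        out.append(C @ h)
    return np.stack(out)
```

This is a sketch under stated assumptions, not the paper's implementation; the real model learns these maps and adds physics-aware conditioning before the backbone and an operator-theoretic decoder after it.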

📝 Abstract
We present PDE-FM, a modular foundation model for physics-informed machine learning that unifies spatial, spectral, and temporal reasoning across heterogeneous partial differential equation (PDE) systems. PDE-FM combines spatial-spectral tokenization, physics-aware conditioning, and a Mamba-based state-space backbone with an operator-theoretic decoder, enabling scalable and data-efficient modeling of complex physical dynamics. In contrast to task-specific neural operators, PDE-FM is pretrained once on diverse PDE datasets and can be transferred to new physical regimes without architectural or data-specific modifications. Evaluated on twelve 2D and 3D datasets from The Well benchmark - spanning hydrodynamic, radiative, elastic, and astrophysical phenomena - PDE-FM achieves state-of-the-art accuracy in six domains, reducing mean VRMSE by 46% relative to prior operator-learning baselines. The model demonstrates robust cross-physics generalization, excelling in turbulent and radiative systems while maintaining strong performance in linear and steady-state regimes. These results suggest that large-scale pretraining across diverse physical processes can yield transferable representations of dynamics, marking a step toward unified, foundation-level surrogates for multi-physics simulation and scientific discovery.
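The abstract reports results in VRMSE. Assuming VRMSE here denotes variance-scaled RMSE as used by The Well benchmark (RMSE normalized by the variance of the target field, so a constant mean predictor scores near 1), the metric can be sketched as:

```python
import numpy as np

def vrmse(pred, target, eps=1e-8):
    """Variance-scaled RMSE. Normalizing by the target's variance makes
    scores comparable across fields with very different magnitudes; eps
    guards against division by zero on constant fields. The exact
    normalization (per-field vs. global, choice of eps) is an assumption."""
    return np.sqrt(np.mean((pred - target) ** 2) / (np.var(target) + eps))
```

Under this definition, a perfect prediction scores 0 and predicting the target's mean everywhere scores approximately 1, which is why sub-1 VRMSE indicates the model captures nontrivial dynamics.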
Problem

Research questions and friction points this paper is trying to address.

Develops a foundation model for diverse PDE systems
Enables scalable and data-efficient modeling of physical dynamics
Achieves cross-physics generalization without task-specific modifications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Modular foundation model for physics-informed machine learning
Combines spatial-spectral tokenization and physics-aware conditioning
Pretrained once on diverse PDE datasets for cross-physics generalization
Eduardo Soares
IBM Research Brazil, Sao Paulo, Brazil
Emilio Vital Brazil
IMPA; University of Calgary; IBM Research
Research areas: Applied Math, Machine Learning, Computer Graphics, Graph Theory, Visual Computing
Victor Shirasuna
IBM Research Brazil, Sao Paulo, Brazil
Breno W. S. R. de Carvalho
IBM Research Brazil, Rio de Janeiro, Brazil
Cristiano Malossi
IBM Research Zurich, Rüschlikon, Switzerland