Walrus: A Cross-Domain Foundation Model for Continuum Dynamics

📅 2025-11-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Modeling continuum dynamics (e.g., fluid flow, astrophysics) faces key bottlenecks: data heterogeneity, long-horizon prediction instability, and difficulty in adapting to multi-resolution and multi-dimensional configurations. Method: This work introduces the first cross-domain foundation model tailored for physics simulation. It innovatively integrates harmonic-analysis-driven stability constraints, computation-adaptive tokenization, and a 2D/3D distributed training framework with load balancing. Furthermore, it embeds frequency-domain feature modeling and dynamic data encoding into the Transformer architecture. Contribution/Results: Evaluated across 19 diverse pretraining scenarios, the model surpasses existing baselines in both short- and long-horizon predictions. Ablation studies confirm a 37% improvement in training throughput and significantly enhanced cross-domain generalization. The proposed framework establishes a scalable paradigm for foundation model development in physics-informed simulation.

Technology Category

Application Category

📝 Abstract
Foundation models have transformed machine learning for language and vision, but achieving comparable impact in physical simulation remains a challenge. Data heterogeneity and unstable long-term dynamics inhibit learning from sufficiently diverse dynamics, while varying resolutions and dimensionalities challenge efficient training on modern hardware. Through empirical and theoretical analysis, we incorporate new approaches to mitigate these obstacles, including a harmonic-analysis-based stabilization method, load-balanced distributed 2D and 3D training strategies, and compute-adaptive tokenization. Using these tools, we develop Walrus, a transformer-based foundation model developed primarily for fluid-like continuum dynamics. Walrus is pretrained on nineteen diverse scenarios spanning astrophysics, geoscience, rheology, plasma physics, acoustics, and classical fluids. Experiments show that Walrus outperforms prior foundation models on both short and long term prediction horizons on downstream tasks and across the breadth of pretraining data, while ablation studies confirm the value of our contributions to forecast stability, training throughput, and transfer performance over conventional approaches. Code and weights are released for community use.
Problem

Research questions and friction points this paper is trying to address.

Addressing data heterogeneity and unstable dynamics in physical simulation learning
Overcoming varying resolutions and dimensionalities for efficient hardware training
Developing cross-domain foundation models for continuum dynamics prediction
Innovation

Methods, ideas, or system contributions that make the work stand out.

Harmonic-analysis-based stabilization for long-term dynamics
Load-balanced distributed 2D and 3D training strategies
Compute-adaptive tokenization for varying resolutions and dimensionalities
🔎 Similar Papers
Michael McCabe
Michael McCabe
Flatiron Institute
Machine learningcomputational scienceoptimizationnumerical analysis
P
Payel Mukhopadhyay
University of Cambridge
T
Tanya Marwah
Flatiron Institute
Bruno Régaldo-Saint Blancard
Bruno Régaldo-Saint Blancard
Flatiron Institute
F
Francois Rozet
Flatiron Institute, University of Liège
Cristiana Diaconu
Cristiana Diaconu
PhD Student, University of Cambridge
probabilistic machine learningdiffusion modelsneural processes
Lucas Meyer
Lucas Meyer
Flatiron Institute
K
Kaze W. K. Wong
Flatiron Institute
H
Hadi Sotoudeh
University of Cambridge
Alberto Bietti
Alberto Bietti
Flatiron Institute, Simons Foundation
machine learningoptimizationstatistics
I
Irina Espejo
Flatiron Institute, New York University
R
Rio Fear
University of Cambridge
Siavash Golkar
Siavash Golkar
Research Scientist, New York University
Machine LearningArtificial IntelligenceTheoretical Physics
T
Tom Hehir
University of Cambridge
Keiya Hirashima
Keiya Hirashima
RIEKN Center for Interdisciplinary Theoretical and Mathematical Sciences
Machine learningHPCGalaxy formation and evolution
G
G. Krawezik
Flatiron Institute
François Lanusse
François Lanusse
CNRS Researcher
observational cosmologydeep learning
R
Rudy Morel
Flatiron Institute
Ruben Ohana
Ruben Ohana
Senior Research Scientist, NVIDIA
Machine LearningAI for ScienceComputer VisionOptical Computing
L
L. Parker
Flatiron Institute
Mariel Pettee
Mariel Pettee
University of Wisconsin-Madison
Machine LearningHigh-Energy Particle PhysicsAstrophysics
J
Jeff Shen
Princeton University
Kyunghyun Cho
Kyunghyun Cho
New York University, Genentech
Machine LearningDeep Learning
Miles Cranmer
Miles Cranmer
University of Cambridge
Machine LearningAstrophysicsFluid Dynamics
Shirley Ho
Shirley Ho
Flatiron Institute, Center for Computational Astrophysics
CosmologyAstrophysicsMachine LearningStatistics