🤖 AI Summary
Inferring the vector field of ordinary differential equations (ODEs) from noisy trajectory data typically relies on complex training procedures or extensive domain expertise. This work proposes FIM-ODE, the first method to bring the foundation-model paradigm to ODE inference: it uses neural operators to predict the vector field in a single forward pass. Although pretrained only on a prior of low-degree polynomial vector fields, FIM-ODE supports amortized inference for low-dimensional ODEs. In zero-shot settings it matches, and often surpasses, state-of-the-art symbolic regression methods such as ODEFormer. After fine-tuning, it significantly outperforms baseline approaches, including neural ODEs and Gaussian processes, while offering faster, more stable training and substantially reducing reliance on expert knowledge.
📝 Abstract
Ordinary differential equations (ODEs) are central to scientific modelling, but inferring their vector fields from noisy trajectories remains challenging. Current approaches such as symbolic regression, Gaussian process (GP) regression, and Neural ODEs often require complex training pipelines and substantial machine learning expertise, or they depend strongly on system-specific prior knowledge. We propose FIM-ODE, a pretrained Foundation Inference Model that amortises low-dimensional ODE inference by predicting the vector field directly from noisy trajectory data in a single forward pass. We pretrain FIM-ODE on a prior distribution over ODEs with low-degree polynomial vector fields and represent the target field with neural operators. FIM-ODE achieves strong zero-shot performance, matching and often improving upon ODEFormer, a recent pretrained symbolic baseline, across a range of regimes despite using a simpler pretraining prior distribution. Pretraining also provides a strong initialisation for finetuning, enabling fast and stable adaptation that outperforms modern neural and GP baselines without requiring machine learning expertise.
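To make the task concrete, the sketch below is *not* FIM-ODE; it illustrates the underlying problem the abstract describes (recovering a low-degree polynomial vector field from a noisy trajectory) with a classical per-system baseline: finite-difference derivative estimates followed by least squares. All names and settings here are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth 2-D linear vector field f(x) = A @ x (a degree-1 polynomial),
# a damped oscillator chosen purely for illustration.
A = np.array([[0.0, 1.0],
              [-1.0, -0.1]])

# Simulate a trajectory with small Euler steps, then subsample and add noise
# to mimic noisy observations on a coarser time grid.
dt, n_steps = 1e-3, 20000
x = np.empty((n_steps, 2))
x[0] = [1.0, 0.0]
for t in range(n_steps - 1):
    x[t + 1] = x[t] + dt * (A @ x[t])

stride = 20
traj_noisy = x[::stride] + 0.001 * rng.standard_normal(x[::stride].shape)
h = stride * dt  # observed sampling interval

# Central differences approximate dx/dt at interior observation points.
dxdt = (traj_noisy[2:] - traj_noisy[:-2]) / (2 * h)
X = traj_noisy[1:-1]

# Degree-1 polynomial features [1, x1, x2]; fit each output dimension
# by ordinary least squares.
Phi = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(Phi, dxdt, rcond=None)
A_hat = coef[1:].T  # recovered linear part, should approximate A

print(np.round(A_hat, 2))
```

The contrast with FIM-ODE is that this fit must be redone from scratch for every new system, whereas an amortised foundation model maps the noisy trajectory to a vector-field estimate in one pretrained forward pass.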