Towards Infinitely Long Neural Simulations: Self-Refining Neural Surrogate Models for Dynamical Systems

📅 2026-03-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of distribution drift in autoregressive neural surrogate models during long-term dynamical system simulation, which arises from error accumulation and undermines long-term consistency. To this end, the authors propose a unified mathematical framework that formally characterizes, for the first time, the inherent trade-off between short-term accuracy and long-term consistency. Building upon this framework, they introduce Self-Refining Neural Surrogates (SNS)—a novel class of surrogate models that require no hyperparameter tuning. SNS leverages a conditional diffusion mechanism to iteratively refine its own or existing surrogate outputs, enabling high-fidelity simulation over arbitrarily long time horizons. Experimental results demonstrate that SNS substantially improves both long-term stability and simulation accuracy, overcoming the limitations of conventional approaches that rely on empirical hyperparameter tuning.

Technology Category

Application Category

📝 Abstract
Recent advances in autoregressive neural surrogate models have enabled orders-of-magnitude speedups in simulating dynamical systems. However, autoregressive models are generally prone to distribution drift: compounding errors in autoregressive rollouts that severely degrade generation quality over long time horizons. Existing work attempts to address this issue by implicitly leveraging the inherent trade-off between short-time accuracy and long-time consistency through hyperparameter tuning. In this work, we introduce a unifying mathematical framework that makes this tradeoff explicit, formalizing and generalizing hyperparameter-based strategies in existing approaches. Within this framework, we propose a robust, hyperparameter-free model implemented as a conditional diffusion model that balances short-time fidelity with long-time consistency by construction. Our model, Self-refining Neural Surrogate model (SNS), can be implemented as a standalone model that refines its own autoregressive outputs or as a complementary model to existing neural surrogates to ensure long-time consistency. We also demonstrate the numerical feasibility of SNS through high-fidelity simulations of complex dynamical systems over arbitrarily long time horizons.
Problem

Research questions and friction points this paper is trying to address.

distribution drift
autoregressive models
dynamical systems
long-time consistency
neural surrogate models
Innovation

Methods, ideas, or system contributions that make the work stand out.

self-refining
neural surrogate
dynamical systems
conditional diffusion model
long-time consistency
🔎 Similar Papers
No similar papers found.