Thermalizer: Stable autoregressive neural emulation of spatiotemporal chaos

📅 2025-03-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Autoregressive neural surrogate models suffer from trajectory divergence in long-term forecasting of spatiotemporal chaotic systems due to error accumulation during rollout. To address this, we propose a stabilized autoregressive neural surrogate model: the first to employ diffusion models for implicit estimation of the score function of the invariant measure, coupled with an online thermalization mechanism that dynamically applies score-based denoising regularization during inference. Our approach integrates diffusion-driven implicit distribution modeling, autoregressive neural dynamics modeling, and real-time denoising control to suppress error propagation at its source. Experiments on turbulent flows and canonical chaotic systems—including the Kuramoto–Sivashinsky equation—demonstrate a tenfold improvement in stable prediction horizon, significantly enhancing both the applicability and reliability of neural surrogates for long-timescale forecasting.

📝 Abstract
Autoregressive surrogate models (or *emulators*) of spatiotemporal systems provide an avenue for fast, approximate predictions, with broad applications across science and engineering. At inference time, however, these models are generally unable to provide predictions over long time rollouts due to accumulation of errors leading to diverging trajectories. In essence, emulators operate out of distribution, and controlling the online distribution quickly becomes intractable in large-scale settings. To address this fundamental issue, and focusing on time-stationary systems admitting an invariant measure, we leverage diffusion models to obtain an implicit estimator of the score of this invariant measure. We show that this model of the score function can be used to stabilize autoregressive emulator rollouts by applying on-the-fly denoising during inference, a process we call *thermalization*. Thermalizing an emulator rollout is shown to extend the time horizon of stable predictions by an order of magnitude in complex systems exhibiting turbulent and chaotic behavior, opening up a novel application of diffusion models in the context of neural emulation.
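The mechanism the abstract describes can be sketched in a few lines: roll out the emulator autoregressively, and every few steps nudge the state toward high-density regions of the invariant measure by ascending a learned score function. The sketch below is a minimal toy illustration, not the paper's method: `emulator_step` and `score` are hypothetical stand-ins (a contractive map with a spurious drift, and the exact score of a standard Gaussian) in place of the trained neural emulator and diffusion-model score network.

```python
import numpy as np

def emulator_step(x):
    # Hypothetical stand-in emulator: a contractive map plus a small
    # spurious drift, so error accumulates steadily over the rollout.
    return 0.99 * x + 0.05

def score(x):
    # Hypothetical stand-in score of the invariant measure: the
    # gradient of log N(0, I), i.e. -x, pulling states back toward
    # the attractor. The paper estimates this implicitly with a
    # diffusion model; here it is known in closed form.
    return -x

def thermalized_rollout(x0, n_steps, denoise_every=10,
                        step_size=0.1, n_denoise=5):
    """Autoregressive rollout with on-the-fly score-based denoising
    ("thermalization") applied every `denoise_every` steps."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for t in range(1, n_steps + 1):
        x = emulator_step(x)
        if t % denoise_every == 0:
            for _ in range(n_denoise):
                # Gradient-ascent denoising step toward high density.
                x = x + step_size * score(x)
        traj.append(x.copy())
    return np.stack(traj)

traj = thermalized_rollout(np.zeros(4), n_steps=200)
```

Without the denoising steps this toy emulator drifts to its spurious fixed point at 5; with periodic thermalization the state stays near the origin, mirroring the stabilization effect described above. The choice of denoising frequency and step size plays the role of the paper's online thermalization schedule.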
Problem

Research questions and friction points this paper is trying to address.

Stabilizing autoregressive emulators for long-term predictions
Controlling error accumulation in spatiotemporal chaos emulation
Extending stable prediction horizons with diffusion models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses diffusion models for score estimation
Applies on-the-fly denoising during inference
Stabilizes autoregressive emulator rollouts effectively
Chris Pedersen
Courant Institute of Mathematical Sciences, New York University, USA; Center for Data Science, New York University, USA
Laure Zanna
Courant Institute of Mathematical Sciences, New York University, USA; Center for Data Science, New York University, USA
Joan Bruna
Professor of Computer Science, Data Science & Mathematics, Courant Institute and CDS, NYU
Machine Learning