Memory-Conditioned Flow-Matching for Stable Autoregressive PDE Rollouts

📅 2026-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the drift and instability of autoregressive partial differential equation (PDE) solvers over long-horizon rollouts, particularly under coarse-to-fine generation, where errors stem from neglecting unresolved scales. Building on the Mori–Zwanzig projection formalism, which shows that the exact resolved dynamics contain a Markov term, a memory term, and an orthogonal forcing, the authors propose a memory-augmented flow-matching method that injects a compact online latent state into denoising to construct a structured conditional prior over unresolved scales, overcoming a structural limitation of memoryless closure models. The resulting conditional kernel is proven Wasserstein-stable, and a discrete Grönwall bound on rollout error separates memory approximation error from conditional generation error. Experiments on compressible flows with shocks and multiscale mixing problems demonstrate substantial improvements in long-horizon prediction accuracy, spectral fidelity, and statistical consistency.
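The Mori–Zwanzig decomposition mentioned above has a standard form worth stating; for a linear generator $\dot{x} = Lx$, a projection $P$ onto resolved variables, and $Q = I - P$ (the generic identity, not the paper's specific instantiation):

```latex
\frac{d}{dt}\, e^{tL} x_0
  = \underbrace{e^{tL} P L x_0}_{\text{Markov term}}
  + \underbrace{\int_0^t e^{(t-s)L}\, P L\, e^{sQL} Q L x_0 \, ds}_{\text{memory term}}
  + \underbrace{e^{tQL} Q L x_0}_{\text{orthogonal forcing } F(t)}
```

The forcing $F(t) = e^{tQL} Q L x_0$ solves the orthogonal dynamics $\partial_t F = QLF$. A memoryless closure retains only the Markov term, which is the structural limitation the summary refers to.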

📝 Abstract
Autoregressive generative PDE solvers can be accurate one step ahead yet drift over long rollouts, especially in coarse-to-fine regimes where each step must regenerate unresolved fine scales. This is the regime of diffusion and flow-matching generators: although their internal dynamics are Markovian, rollout stability is governed by per-step \emph{conditional law} errors. Using the Mori--Zwanzig projection formalism, we show that eliminating unresolved variables yields an exact resolved evolution with a Markov term, a memory term, and an orthogonal forcing, exposing a structural limitation of memoryless closures. Motivated by this, we introduce memory-conditioned diffusion/flow-matching with a compact online state injected into denoising via latent features. Via disintegration, memory induces a structured conditional tail prior for unresolved scales and reduces the transport needed to populate missing frequencies. We prove Wasserstein stability of the resulting conditional kernel. We then derive discrete Gr\"onwall rollout bounds that separate memory approximation from conditional generation error. Experiments on compressible flows with shocks and multiscale mixing show improved accuracy and markedly more stable long-horizon rollouts, with better fine-scale spectral and statistical fidelity.
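As a concrete illustration of the training objective the abstract describes, here is a minimal sketch of memory-conditioned flow matching along a linear (rectified) probability path. All names below (`update_memory`, `v_theta`, the exponential-moving-average memory) are hypothetical stand-ins, not the paper's architecture or memory encoder.

```python
# Minimal sketch of memory-conditioned flow matching.
# Assumption: a compact online memory summarizes past resolved states and
# conditions the velocity field; the paper uses learned latent features instead.
import numpy as np

rng = np.random.default_rng(0)

def update_memory(m, resolved_state, beta=0.9):
    """Compact online memory: exponential moving summary of past resolved
    states (a stand-in for a learned recurrent encoder)."""
    return beta * m + (1.0 - beta) * resolved_state

def flow_matching_loss(v_theta, x0, x1, m, t):
    """Conditional flow-matching loss along the linear path
    x_t = (1 - t) x0 + t x1, whose target velocity is x1 - x0."""
    xt = (1.0 - t) * x0 + t * x1
    target = x1 - x0
    pred = v_theta(xt, t, m)  # memory m conditions the velocity field
    return float(np.mean((pred - target) ** 2))

# Toy "network": ignores t and uses memory as a shift (illustration only).
def v_theta(xt, t, m):
    return 0.0 * xt + m

d = 8
x0 = rng.standard_normal(d)   # sample from the (conditional) prior
x1 = rng.standard_normal(d)   # fine-scale target sample
m = update_memory(np.zeros(d), x1)
loss = flow_matching_loss(v_theta, x0, x1, m, t=0.5)
print("loss:", loss)
```

In the paper's setting, `v_theta` would be a neural velocity field conditioned on latent memory features updated online during the rollout; the closed-form toy above only shows where the conditioning enters the per-step objective.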
Problem

Research questions and friction points this paper is trying to address.

autoregressive PDE solvers
rollout stability
coarse-to-fine regimes
conditional law errors
unresolved scales
Innovation

Methods, ideas, or system contributions that make the work stand out.

memory-conditioned flow-matching
Mori–Zwanzig formalism
autoregressive PDE solvers
conditional generative modeling
Wasserstein stability
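The discrete Grönwall bound listed among the contributions has, in its generic form (the paper's specific constants are not reproduced here), the shape: if the per-step rollout error satisfies $e_{n+1} \le (1+a)\, e_n + b$ with $a > 0$, then

```latex
e_n \;\le\; (1+a)^n e_0 + \frac{(1+a)^n - 1}{a}\, b
    \;\le\; e^{na}\,\bigl(e_0 + n b\bigr)
```

Here $a$ roughly corresponds to the Wasserstein contraction/Lipschitz constant of the conditional kernel and $b$ to the per-step memory-approximation plus conditional-generation error, matching the separation the abstract claims.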