π€ AI Summary
This work addresses two key bottlenecks in autoregressive and diffusion-based solvers for time-dependent partial differential equations (PDEs): high inference overhead due to multi-step sampling, and poor adaptability of isotropic Gaussian noise to irregular geometric domains. We propose a generative latent-space neural solver. Methodologically, we first employ a mesh-agnostic autoencoder to map arbitrary unstructured inputs into a low-dimensional, structured latent space; then introduce flow matchingβa deterministic, single-step training and inference framework that supports sparse noise scheduling. To our knowledge, this is the first integration of flow matching into a latent-space PDE solver architecture. Evaluated on multiple benchmark PDE tasks, our method significantly outperforms state-of-the-art deterministic solvers: it improves long-horizon prediction accuracy and stability, accelerates inference by over 3Γ, and achieves superior computational efficiency, geometric generalizability, and physical consistency.
π Abstract
Autoregressive next-step prediction models have become the de-facto standard for building data-driven neural solvers to forecast time-dependent partial differential equations (PDEs). Denoise training that is closely related to diffusion probabilistic model has been shown to enhance the temporal stability of neural solvers, while its stochastic inference mechanism enables ensemble predictions and uncertainty quantification. In principle, such training involves sampling a series of discretized diffusion timesteps during both training and inference, inevitably increasing computational overhead. In addition, most diffusion models apply isotropic Gaussian noise on structured, uniform grids, limiting their adaptability to irregular domains. We propose a latent diffusion model for PDE simulation that embeds the PDE state in a lower-dimensional latent space, which significantly reduces computational costs. Our framework uses an autoencoder to map different types of meshes onto a unified structured latent grid, capturing complex geometries. By analyzing common diffusion paths, we propose to use a coarsely sampled noise schedule from flow matching for both training and testing. Numerical experiments show that the proposed model outperforms several deterministic baselines in both accuracy and long-term stability, highlighting the potential of diffusion-based approaches for robust data-driven PDE learning.