Synergizing Transport-Based Generative Models and Latent Geometry for Stochastic Closure Modeling

📅 2026-02-19
🤖 AI Summary
This work addresses two limitations of conventional diffusion models in stochastic closure modeling, namely slow sampling and the trade-off between physical fidelity and data efficiency, by introducing a transport-based generative approach that enables fast single-step sampling through flow matching on a low-dimensional latent manifold. By combining explicit geometric regularization (metric-preserving and geometry-aware constraints) with implicit regularization via joint training, the method controls latent-space distortion to preserve physical consistency. Even with limited training data, it accurately reconstructs the topological structure and physical properties of the underlying dynamical system, while sampling up to two orders of magnitude faster than iterative diffusion models.

📝 Abstract
Diffusion models recently developed for generative AI tasks can produce high-quality samples while maintaining diversity among samples to promote mode coverage, providing a promising path for learning stochastic closure models. Compared with other generative AI models, such as GANs and VAEs, however, slow sampling speed is a key disadvantage of diffusion models. By systematically comparing transport-based generative models on a numerical example of 2D Kolmogorov flows, we show that flow matching in a lower-dimensional latent space is well suited for fast sampling of stochastic closure models, enabling single-step sampling that is up to two orders of magnitude faster than iterative diffusion-based approaches. To control latent-space distortion and thus ensure the physical fidelity of the sampled closure term, we compare the implicit regularization offered by a joint training scheme against two explicit regularizers: metric-preserving (MP) and geometry-aware (GA) constraints. Besides offering faster sampling, both explicitly and implicitly regularized latent spaces inherit the key topological information of the lower-dimensional manifold underlying the original complex dynamical system, which enables learning stochastic closure models without requiring a large amount of training data.
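To make the flow-matching mechanism behind single-step sampling concrete, here is a minimal toy sketch (not from the paper; the Gaussian latent distribution, dimensions, and the deliberately crude constant-velocity model are all illustrative assumptions). With straight-line probability paths x_t = (1-t)·x0 + t·x1, the conditional flow-matching regression target for the velocity field is simply the displacement x1 - x0, and a trained velocity field can be integrated with a single Euler step over t ∈ [0, 1]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for low-dimensional latent codes of a closure term:
# a shifted 2D Gaussian (purely illustrative, not the paper's data).
target_mean = np.array([2.0, -1.0])
x1 = rng.normal(size=(4096, 2)) + target_mean   # "data" samples in latent space
x0 = rng.normal(size=(4096, 2))                 # base (noise) samples

# Conditional flow matching with straight-line paths x_t = (1-t)*x0 + t*x1:
# the velocity regression target is x1 - x0. Here we fit the crudest
# possible velocity model, a single constant vector, whose flow-matching
# least-squares minimizer is just the mean displacement.
v_hat = (x1 - x0).mean(axis=0)

# Single-step sampling: one Euler step of size 1 from fresh base noise.
samples = rng.normal(size=(4096, 2)) + 1.0 * v_hat

print(np.round(samples.mean(axis=0), 1))  # close to target_mean
```

A constant velocity can only match the target's mean; in practice the velocity field is a neural network conditioned on x_t and t, and the comparison in the paper concerns how many such integration steps are needed, with flow matching allowing the single step shown here.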
Problem

Research questions and friction points this paper is trying to address.

stochastic closure modeling
transport-based generative models
latent geometry
sampling efficiency
physical fidelity
Innovation

Methods, ideas, or system contributions that make the work stand out.

transport-based generative models
latent geometry
flow matching
stochastic closure modeling
single-step sampling