Project and Generate: Divergence-Free Neural Operators for Incompressible Flows

📅 2026-03-25
📈 Citations: 0 · Influential: 0
🤖 AI Summary
This work addresses the unphysical, unstable simulations produced by existing learning-based fluid dynamics models, which often fail to satisfy the incompressible continuity equation exactly because they impose no hard physical constraints. The authors propose a unified framework that enforces the divergence-free constraint as a hard, built-in property, so that both deterministic and generative models operate exclusively within the solenoidal subspace. The key innovation is the first integration of a differentiable spectral Leray projector, derived from the Helmholtz–Hodge decomposition, with a curl-driven Gaussian reference measure, ensuring that both model outputs and generated distributions rigorously satisfy zero divergence. Experiments on the two-dimensional Navier–Stokes equations demonstrate exact incompressibility up to discretization error, with substantially improved long-term simulation stability and physical fidelity.

📝 Abstract
Learning-based models for fluid dynamics often operate in unconstrained function spaces, leading to physically inadmissible, unstable simulations. While penalty-based methods offer soft regularization, they provide no structural guarantees, resulting in spurious divergence and long-term collapse. In this work, we introduce a unified framework that enforces the incompressible continuity equation as a hard, intrinsic constraint for both deterministic and generative modeling. First, to project deterministic models onto the divergence-free subspace, we integrate a differentiable spectral Leray projection grounded in the Helmholtz-Hodge decomposition, which restricts the regression hypothesis space to physically admissible velocity fields. Second, to generate physically consistent distributions, we show that simply projecting model outputs is insufficient when the prior is incompatible. To address this, we construct a divergence-free Gaussian reference measure via a curl-based pushforward, ensuring the entire probability flow remains subspace-consistent by construction. Experiments on 2D Navier-Stokes equations demonstrate exact incompressibility up to discretization error and substantially improved stability and physical consistency.
Problem

Research questions and friction points this paper is trying to address.

incompressible flows
divergence-free
neural operators
physical consistency
fluid dynamics
Innovation

Methods, ideas, or system contributions that make the work stand out.

divergence-free
neural operators
Leray projection
Helmholtz-Hodge decomposition
generative modeling
👥 Authors

Xigui Li (Fudan University)
Hongwei Zhang (Fudan University)
Ruoxi Jiang (Artificial Intelligence Innovation and Incubation Institute, Fudan University, Shanghai, China; Shanghai Academy of Artificial Intelligence for Science, Shanghai, China)
Deshu Chen (Artificial Intelligence Innovation and Incubation Institute, Fudan University, Shanghai, China; Shanghai Academy of Artificial Intelligence for Science, Shanghai, China)
Chensen Lin (Fudan University)
Limei Han (Artificial Intelligence Innovation and Incubation Institute, Fudan University, Shanghai, China; Shanghai Academy of Artificial Intelligence for Science, Shanghai, China)
Yuan Qi (Artificial Intelligence Innovation and Incubation Institute, Fudan University, Shanghai, China; Shanghai Academy of Artificial Intelligence for Science, Shanghai, China)
Xin Guo (Fudan University)
Yuan Cheng (University of Nottingham Ningbo China)