🤖 AI Summary
This work addresses the challenge of inferring physical fields from sparse observations while strictly enforcing partial differential equation (PDE) constraints and preserving the statistical priors encoded in pretrained generative models. To this end, the authors propose ProFlow, a framework that enables zero-shot, physically consistent sampling without fine-tuning the generative model, by alternating between proximal optimization and manifold interpolation. ProFlow is the first method to coherently integrate hard PDE constraints, observation fidelity, and generative priors, and it admits a Bayesian interpretation as a local maximum a posteriori (MAP) estimator. Experiments on the Poisson, Helmholtz, Darcy, and viscous Burgers equations demonstrate that ProFlow significantly outperforms existing diffusion- and flow-based models in physical consistency, data fidelity, and accuracy in capturing distributional statistics.
📝 Abstract
Inferring physical fields from sparse observations while strictly satisfying partial differential equations (PDEs) is a fundamental challenge in computational physics. Deep generative models have recently emerged as powerful data-driven priors for such inverse problems, yet existing methods struggle to enforce hard physical constraints without costly retraining or disrupting the learned generative prior. Consequently, there is a critical need for a sampling mechanism that can reconcile strict physical consistency and observational fidelity with the statistical structure of the pretrained prior. To this end, we present ProFlow, a proximal guidance framework for zero-shot physics-consistent sampling, defined as inferring solutions from sparse observations using a fixed generative prior without task-specific retraining. The algorithm employs a rigorous two-step scheme that alternates between: (i) a terminal optimization step, which projects the flow prediction onto the intersection of the physically and observationally consistent sets via proximal minimization; and (ii) an interpolation step, which maps the refined state back to the generative trajectory to maintain consistency with the learned flow probability path. This procedure admits a Bayesian interpretation as a sequence of local maximum a posteriori (MAP) updates. Comprehensive benchmarks on the Poisson, Helmholtz, Darcy, and viscous Burgers equations demonstrate that ProFlow achieves superior physical and observational consistency, as well as more accurate distributional statistics, compared to state-of-the-art diffusion- and flow-based baselines.