🤖 AI Summary
Generative models often struggle to strictly satisfy physical constraints—such as partial differential equations (PDEs)—leading to physically inconsistent predictions.
Method: This paper proposes embedding differentiable physical priors directly into the training objective of diffusion models. Specifically, it introduces a PDE-residual-based constraint term within the denoising loss, enabling equality constraints, inequality constraints, and auxiliary optimization objectives to be imposed in a unified way. The approach effectively combines diffusion modeling with PINN-style residual minimization and extends to tasks such as structural optimization.
Contribution/Results: On a fluid flow case study, the method reduces PDE residuals by up to two orders of magnitude compared with prior work; in structural topology optimization, it outperforms task-specific frameworks on relevant metrics. The extended training objective also acts as a natural regularizer against overfitting, and the framework applies across diverse physical domains.
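The core idea above can be sketched in a few lines: augment the standard denoising loss with a weighted mean-squared PDE residual evaluated on the generated sample. The sketch below is illustrative, not the authors' implementation; the names (`pde_residual`, `combined_loss`, `lam`) and the toy 1D Laplace residual computed by finite differences are assumptions for demonstration only.

```python
# Hedged sketch: a denoising MSE plus a PDE-residual penalty.
# Toy setting: 1D Laplace equation u'' = 0, residual via finite differences.
import numpy as np

def pde_residual(u, dx):
    """Finite-difference residual of u'' = 0 at interior grid points."""
    return (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2

def combined_loss(pred_noise, true_noise, sample, dx, lam=1.0):
    """Standard denoising MSE plus a weighted mean-squared PDE residual."""
    denoise = np.mean((pred_noise - true_noise) ** 2)
    physics = np.mean(pde_residual(sample, dx) ** 2)
    return denoise + lam * physics

# Sanity check: a linear field solves u'' = 0, so the physics term vanishes
# (up to floating-point round-off) and only the denoising term remains.
x = np.linspace(0.0, 1.0, 11)
u_linear = 2.0 * x + 1.0            # exact solution of u'' = 0
noise = np.zeros_like(x)
loss = combined_loss(noise, noise, u_linear, dx=x[1] - x[0])
```

In an actual diffusion setup, `sample` would be the model's denoised prediction at a given timestep and the residual would be differentiated through the network, so the physics term shapes the learned distribution rather than just scoring outputs.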
📝 Abstract
Generative models such as denoising diffusion models are quickly advancing their ability to approximate highly complex data distributions. They are also increasingly leveraged in scientific machine learning, where samples from the implied data distribution are expected to adhere to specific governing equations. We present a framework that unifies generative modeling and partial differential equation fulfillment by introducing a first-principle-based loss term that enforces generated samples to fulfill the underlying physical constraints. Our approach reduces the residual error by up to two orders of magnitude compared to previous work in a fluid flow case study and outperforms task-specific frameworks in relevant metrics for structural topology optimization. We also present numerical evidence that our extended training objective acts as a natural regularization mechanism against overfitting. Our framework is simple to implement and versatile in its applicability for imposing equality and inequality constraints as well as auxiliary optimization objectives.