Guided Diffusion Sampling on Function Spaces with Applications to PDEs

📅 2025-05-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of reconstructing full solutions to partial differential equation (PDE) inverse problems from extremely sparse observations (as little as 3% of the data) or strong noise, this paper proposes a discretization-invariant diffusion sampling framework that operates directly in function space. Methodologically, it combines neural operators for operator learning, function-space denoising, gradient-based plug-and-play guidance, and multi-resolution fine-tuning. Key contributions include: (i) an extension of diffusion models to infinite-dimensional Hilbert spaces, yielding what the authors describe as the first discretization-independent diffusion framework for PDE forward and inverse problems; (ii) a rigorous generalization of Tweedie's formula to function space, enabling theoretically grounded denoising estimation; and (iii) a mesh-free, resolution-agnostic conditional sampling mechanism. Evaluated on five canonical PDE inverse tasks, the method improves reconstruction accuracy by 32% on average and reduces sampling steps by 4× compared to fixed-resolution diffusion baselines.
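The guidance mechanism described above can be sketched as a DPS-style reverse-diffusion update: at each step, Tweedie's formula turns the current noisy iterate into a denoised estimate, and a gradient of the masked observation misfit nudges that estimate toward the sparse data. The sketch below is a minimal finite-dimensional illustration with numpy, not the authors' implementation; all names (`tweedie_denoise`, `guided_step`, `score_fn`, `mask`) are illustrative, and in FunDPS the score would come from a neural-operator denoiser acting on functions.

```python
import numpy as np

def tweedie_denoise(x_t, sigma, score_fn):
    """Tweedie's formula: E[x_0 | x_t] = x_t + sigma^2 * score(x_t, sigma).

    `score_fn` is any stand-in for a learned score/denoiser."""
    return x_t + sigma**2 * score_fn(x_t, sigma)

def guided_step(x_t, sigma_t, sigma_next, score_fn, y_obs, mask, scale):
    """One reverse-diffusion step with plug-and-play guidance (sketch).

    The data-fidelity gradient of ||mask * (x0_hat - y_obs)||^2 is
    approximated by the masked residual (denoiser Jacobian ~ identity),
    a common simplification in DPS-style guidance."""
    x0_hat = tweedie_denoise(x_t, sigma_t, score_fn)
    # deterministic (ODE-style) update toward the denoised estimate
    x_next = x0_hat + (sigma_next / sigma_t) * (x_t - x0_hat)
    # guidance: pull the sample toward the sparse observations
    grad = mask * (x0_hat - y_obs)
    return x_next - scale * grad
```

For a standard Gaussian prior, the exact score is `score(x, s) = -x / (1 + s**2)`, for which Tweedie's formula recovers the true posterior mean, making the sketch easy to sanity-check numerically.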

📝 Abstract
We propose a general framework for conditional sampling in PDE-based inverse problems, targeting the recovery of whole solutions from extremely sparse or noisy measurements. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method first trains an unconditional discretization-agnostic denoising model using neural operator architectures. At inference, we refine the samples to satisfy sparse observation data via a gradient-based guidance mechanism. Through rigorous mathematical analysis, we extend Tweedie's formula to infinite-dimensional Hilbert spaces, providing the theoretical foundation for our posterior sampling approach. Our method (FunDPS) accurately captures posterior distributions in function spaces under minimal supervision and severe data scarcity. Across five PDE tasks with only 3% observation, our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines while reducing sampling steps by 4x. Furthermore, multi-resolution fine-tuning ensures strong cross-resolution generalizability. To the best of our knowledge, this is the first diffusion-based framework to operate independently of discretization, offering a practical and flexible solution for forward and inverse problems in the context of PDEs. Code is available at https://github.com/neuraloperator/FunDPS
Problem

Research questions and friction points this paper is trying to address.

Recovering PDE solutions from sparse/noisy measurements
Extending diffusion models to function spaces
Achieving discretization-agnostic inverse problem solving
Innovation

Methods, ideas, or system contributions that make the work stand out.

Function-space diffusion model for PDE solutions
Plug-and-play gradient-based guidance mechanism
Discretization-agnostic neural operator architectures
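The discretization-agnostic property in the last bullet comes from neural operators such as the Fourier neural operator, whose learned filters live in frequency space and therefore apply unchanged at any grid resolution. A minimal numpy sketch of one spectral-convolution layer (illustrative only; the function name and shapes are assumptions, and a real FNO layer would use complex learned weights per channel):

```python
import numpy as np

def spectral_conv_1d(u, weights):
    """Apply a learned filter to the lowest-frequency Fourier modes of u.

    Because `weights` parameterizes frequency modes rather than grid
    points, the same weights act on inputs of any resolution n -- the
    core of discretization-agnostic operator layers."""
    n = u.shape[-1]
    k = weights.shape[-1]          # number of retained Fourier modes
    u_hat = np.fft.rfft(u)         # forward FFT of the sampled function
    out_hat = np.zeros_like(u_hat)
    out_hat[:k] = u_hat[:k] * weights  # filter low modes, truncate the rest
    return np.fft.irfft(out_hat, n=n)  # back to the original grid
```

Sampling the same underlying function at 64 or 128 points and applying the identical `weights` yields consistent outputs, which is what lets a model trained at one resolution be fine-tuned or evaluated at another.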