🤖 AI Summary
This work addresses the challenge of efficiently and accurately performing conditional sampling $\pi(x \mid y)$ with an unconditional diffusion model, without introducing additional model approximation errors. The proposed method formulates conditional simulation as a partial stochastic differential equation (SDE) bridge inference problem on an augmented state space, and introduces a unified framework integrating particle Gibbs samplers with pseudo-marginal sampling, both grounded in exact SDE bridges. Because the approach incurs only Monte Carlo error and avoids approximations such as posterior drift correction, it preserves Bayesian consistency and eliminates bias from heuristic corrections. Experiments on both synthetic and real-world datasets demonstrate that the method achieves superior sample fidelity and theoretical coherence compared to existing conditional diffusion techniques, establishing a rigorous, principled paradigm for conditional inference in diffusion models.
📝 Abstract
Given an unconditional diffusion model targeting a joint model $\pi(x, y)$, using it to perform conditional simulation $\pi(x \mid y)$ is still largely an open question and is typically achieved by learning conditional drifts to the denoising SDE after the fact. In this work, we express \emph{exact} conditional simulation within the \emph{approximate} diffusion model as an inference problem on an augmented space corresponding to a partial SDE bridge. This perspective allows us to implement efficient and principled particle Gibbs and pseudo-marginal samplers marginally targeting the conditional distribution $\pi(x \mid y)$. Contrary to existing methodology, our methods do not introduce any additional approximation to the unconditional diffusion model aside from the Monte Carlo error. We showcase the benefits and drawbacks of our approach on a series of synthetic and real data examples.
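To make the pseudo-marginal idea mentioned in the abstract concrete, the sketch below shows a generic pseudo-marginal Metropolis-Hastings sampler on a toy Gaussian model. This is not the paper's method: the function `unbiased_likelihood_estimate` is a hypothetical stand-in for the particle estimate that the paper ties to the exact SDE bridge, and the one-dimensional model, step size, and particle count are illustrative assumptions. The key property it demonstrates is the one the abstract relies on: replacing an intractable likelihood with an unbiased Monte Carlo estimate inside the accept/reject step leaves the exact target distribution invariant, so the only error is Monte Carlo error.

```python
import numpy as np

def unbiased_likelihood_estimate(theta, y, n_particles, rng):
    # Hypothetical placeholder for an intractable likelihood p(y | theta):
    # a simple importance-sampling estimate in a toy Gaussian model
    # (x ~ N(theta, 1), y | x ~ N(x, 1)), used purely for illustration.
    x = rng.normal(loc=theta, scale=1.0, size=n_particles)
    weights = np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)
    return weights.mean()  # unbiased estimate of N(y; theta, 2)

def pseudo_marginal_mh(y, n_iters=1000, n_particles=32, step=0.5, seed=0):
    """Generic pseudo-marginal Metropolis-Hastings with a standard-normal
    prior on theta: the exact likelihood is replaced by an unbiased
    estimate, yet the chain still targets the exact posterior."""
    rng = np.random.default_rng(seed)
    theta = 0.0
    z = unbiased_likelihood_estimate(theta, y, n_particles, rng)
    samples = []
    for _ in range(n_iters):
        theta_prop = theta + step * rng.normal()  # random-walk proposal
        z_prop = unbiased_likelihood_estimate(theta_prop, y, n_particles, rng)
        # Acceptance ratio uses the *estimated* likelihoods plus the
        # Gaussian prior ratio; the estimate is recycled if we reject.
        log_alpha = (np.log(z_prop) - np.log(z)
                     - 0.5 * theta_prop**2 + 0.5 * theta**2)
        if np.log(rng.uniform()) < log_alpha:
            theta, z = theta_prop, z_prop
        samples.append(theta)
    return np.array(samples)
```

In the toy model the posterior is available in closed form (mean $y/3$, variance $2/3$), which makes it easy to check that the pseudo-marginal chain concentrates in the right place despite never evaluating the exact likelihood.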