🤖 AI Summary
This work addresses Bayesian inverse problems in which the intermediate posterior distributions arising during diffusion sampling are intractable, for instance when the likelihood is non-integrable or non-differentiable. We propose a gradient-free, task-agnostic inference framework that avoids retraining. The key contribution is the first formulation of a mixture-distribution approximation of these intermediate posteriors, paired with an efficient Gibbs sampling optimization algorithm that works with pre-trained diffusion priors in both pixel and latent space, for images as well as audio. By decoupling inference from gradient-based variational methods and likelihood-dependent MCMC schemes, the approach enables robust posterior sampling without requiring analytic likelihood gradients or integrals. Experiments show substantial improvements in reconstruction quality across diverse inverse tasks: image super-resolution, denoising, inpainting, and audio source separation. The implementation is publicly available.
📝 Abstract
Denoising diffusion models have driven significant progress on Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems, leveraging only inference-time compute and thereby eliminating the need to retrain task-specific models on the same dataset. To approximate the posterior of a Bayesian inverse problem, a diffusion model samples from a sequence of intermediate posterior distributions, each with an intractable likelihood function. This work proposes a novel mixture approximation of these intermediate distributions. Since direct gradient-based sampling of these mixtures is infeasible due to intractable terms, we propose a practical method based on Gibbs sampling. We validate our approach through extensive experiments on image inverse problems, using both pixel- and latent-space diffusion priors, as well as on source separation with an audio diffusion model. The code is available at https://www.github.com/badr-moufad/mgdm
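To make the Gibbs idea concrete, here is a minimal toy sketch (not the paper's actual algorithm) of two-block Gibbs sampling over a joint of a clean signal `x0` and a noised intermediate `xt`, as in diffusion-style posterior sampling. All names and the model are illustrative assumptions: a standard Gaussian prior on `x0` stands in for the diffusion prior, `xt | x0` is the usual forward noising kernel with a toy schedule value `abar`, and the observation is linear-Gaussian, `y = A x0 + sigma * noise`. In this conjugate setting both conditionals are Gaussian, so each Gibbs block is exact and the chain's `x0` marginal matches the analytic posterior `p(x0 | y)`.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m = 4, 2            # latent and observation dimensions (toy sizes)
A = rng.normal(size=(m, d))
sigma = 0.5            # observation noise standard deviation
abar = 0.3             # toy diffusion schedule value, \bar{alpha}_t

x_true = rng.normal(size=d)
y = A @ x_true + sigma * rng.normal(size=m)

# Two-block Gibbs sweep over the joint p(x0, xt | y) in the toy model:
#   x0 ~ N(0, I)                        (stand-in for the diffusion prior)
#   xt | x0 ~ N(sqrt(abar) x0, (1-abar) I)   (forward noising kernel)
#   y  | x0 ~ N(A x0, sigma^2 I)             (linear measurement model)
def gibbs(y, n_iter=5000, burn=1000):
    xt = rng.normal(size=d)
    samples = []
    # The conditional x0 | xt, y is Gaussian with a constant precision:
    # prior + likelihood + noising-kernel terms. Precompute its covariance.
    prec = np.eye(d) + A.T @ A / sigma**2 + (abar / (1 - abar)) * np.eye(d)
    cov = np.linalg.inv(prec)
    chol = np.linalg.cholesky(cov)
    for it in range(n_iter):
        # Block 1: sample x0 | xt, y (product of three Gaussians).
        mean = cov @ (A.T @ y / sigma**2 + np.sqrt(abar) * xt / (1 - abar))
        x0 = mean + chol @ rng.normal(size=d)
        # Block 2: sample xt | x0 from the forward noising kernel.
        xt = np.sqrt(abar) * x0 + np.sqrt(1 - abar) * rng.normal(size=d)
        if it >= burn:
            samples.append(x0)
    return np.array(samples)

samples = gibbs(y)

# Analytic check: xt carries no extra information about y, so the x0
# marginal of the chain should match the standard Gaussian posterior.
post_cov = np.linalg.inv(np.eye(d) + A.T @ A / sigma**2)
post_mean = post_cov @ A.T @ y / sigma**2
```

The paper's setting replaces the Gaussian prior with a pre-trained diffusion model and the exact conditionals with a mixture approximation of the intermediate posteriors, but the alternating structure is the same: no likelihood gradients are needed, only the ability to sample each block.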