A Mixture-Based Framework for Guiding Diffusion Models

📅 2025-02-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of intractable intermediate posterior distributions in Bayesian inverse problems, where the likelihood may be non-integrable or non-differentiable. The authors propose a gradient-free, task-agnostic inference framework that requires no retraining. The key contribution is a mixture-distribution approximation of the intermediate posteriors, coupled with an efficient Gibbs sampling algorithm that exploits pre-trained diffusion priors in both pixel and latent spaces, covering image and audio models. By decoupling inference from gradient-based variational methods and likelihood-dependent MCMC schemes, the approach enables robust posterior sampling without analytic likelihood gradients or integrals. Experiments demonstrate substantial improvements in reconstruction quality across diverse inverse tasks: image super-resolution, denoising, inpainting, and audio source separation. The implementation is publicly available.
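The core Gibbs idea can be illustrated on a toy problem: to sample from a mixture whose joint density is awkward to differentiate, alternate between drawing the component index given the current sample and drawing a fresh sample from the selected component. The sketch below uses an illustrative 1-D Gaussian mixture; it is not the paper's actual intermediate-posterior construction, only a minimal demonstration of the sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D Gaussian mixture standing in for an intermediate posterior
# approximation; weights, means, and stds are illustrative only.
weights = np.array([0.5, 0.5])
means = np.array([-1.0, 1.0])
stds = np.array([1.0, 1.0])

def gibbs_mixture(n_steps, x0=0.0):
    """Gibbs sampler: alternate z | x (component index) and x | z."""
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        # Conditional over the index: p(z = k | x) ∝ w_k N(x; mu_k, s_k^2)
        log_p = np.log(weights) - np.log(stds) - 0.5 * ((x - means) / stds) ** 2
        p = np.exp(log_p - log_p.max())
        p /= p.sum()
        z = rng.choice(len(weights), p=p)
        # Given z, x | z is a single Gaussian and can be sampled exactly
        x = rng.normal(means[z], stds[z])
        samples[t] = x
    return samples

samples = gibbs_mixture(5000)
```

Each conditional is easy to sample even though the mixture as a whole has no convenient gradient, which is the property the paper's method relies on.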

📝 Abstract
Denoising diffusion models have driven significant progress in the field of Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems, only leveraging inference-time compute and thereby eliminating the need to retrain task-specific models on the same dataset. To approximate the posterior of a Bayesian inverse problem, a diffusion model samples from a sequence of intermediate posterior distributions, each with an intractable likelihood function. This work proposes a novel mixture approximation of these intermediate distributions. Since direct gradient-based sampling of these mixtures is infeasible due to intractable terms, we propose a practical method based on Gibbs sampling. We validate our approach through extensive experiments on image inverse problems, utilizing both pixel- and latent-space diffusion priors, as well as on source separation with an audio diffusion model. The code is available at https://www.github.com/badr-moufad/mgdm
Problem

Research questions and friction points this paper is trying to address.

Guiding pre-trained diffusion models to solve Bayesian inverse problems
Approximating the intractable intermediate posterior distributions
Sampling from mixture approximations when direct gradient-based sampling is infeasible
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixture approximation of the intermediate posterior distributions
Gibbs sampling scheme that handles intractable terms
Support for both pixel- and latent-space diffusion priors