Gaussian is All You Need: A Unified Framework for Solving Inverse Problems via Diffusion Posterior Sampling

📅 2024-09-13
🏛️ arXiv.org
📈 Citations: 0 · Influential: 0
🤖 AI Summary
Existing diffusion-based methods for image inverse problems suffer from inaccurate likelihood approximation or computational inefficiency. This paper proposes a unified posterior sampling framework: it approximates the observation model with a Gaussian likelihood and introduces a novel covariance correction term to enhance approximation fidelity. Crucially, the method avoids backpropagating gradients through the diffusion model, substantially reducing computational overhead. By leveraging efficient matrix decomposition techniques—such as Cholesky decomposition—it enables rapid covariance inversion and robust data-consistency enforcement across diverse inverse problems. Evaluated on super-resolution, denoising, and compressed sensing, the approach achieves more stable convergence, superior reconstruction quality, and significantly faster inference, consistently outperforming state-of-the-art diffusion-based methods.
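The summary's mention of Cholesky-based covariance inversion can be illustrated with a short sketch. This is a hedged reconstruction, not the paper's exact method: the isotropic prior covariance `sigma_x**2 * I` and the posterior-mean-style update are illustrative assumptions chosen to show how a Gaussian likelihood correction can be applied with a single factorization and no gradient computation.

```python
import numpy as np

def gaussian_likelihood_update(x0_hat, y, A, sigma_y, sigma_x):
    """Pull a denoised estimate x0_hat toward observations y = A x + noise.

    Sketch only: assumes a Gaussian likelihood with covariance
    C = sigma_y^2 I + sigma_x^2 A A^T (isotropic prior variance sigma_x^2
    is an assumption, not the paper's covariance correction term).
    """
    m = A.shape[0]
    C = sigma_y**2 * np.eye(m) + sigma_x**2 * (A @ A.T)
    # Cholesky factorization C = L L^T enables a fast, stable solve
    # instead of forming C^{-1} explicitly.
    L = np.linalg.cholesky(C)
    residual = y - A @ x0_hat
    # Solve L w = residual, then L^T z = w (generic solves shown for
    # brevity; scipy.linalg.solve_triangular would exploit the structure).
    w = np.linalg.solve(L, residual)
    z = np.linalg.solve(L.T, w)
    # Posterior-mean-style correction toward data consistency.
    return x0_hat + sigma_x**2 * (A.T @ z)
```

For many inverse problems (e.g. inpainting or super-resolution with structured `A`), `A @ A.T` is diagonal or block-structured, which is where the factorization tricks the summary refers to pay off.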

📝 Abstract
Diffusion models can generate a variety of high-quality images by modeling complex data distributions. Trained diffusion models can also be very effective image priors for solving inverse problems. Most of the existing diffusion-based methods integrate data consistency steps within the diffusion reverse sampling process. The data consistency steps rely on an approximate likelihood function. In this paper, we show that the existing approximations are either insufficient or computationally inefficient. To address these issues, we propose a unified likelihood approximation method that incorporates a covariance correction term to enhance the performance and avoids propagating gradients through the diffusion model. The correction term, when integrated into the reverse diffusion sampling process, achieves better convergence towards the true data posterior for selected distributions and improves performance on real-world natural image datasets. Furthermore, we present an efficient way to factorize and invert the covariance matrix of the likelihood function for several inverse problems. We present comprehensive experiments to demonstrate the effectiveness of our method over several existing approaches.
Problem

Research questions and friction points this paper is trying to address.

Improving likelihood approximation in diffusion-based inverse problems
Enhancing convergence to true data posterior with covariance correction
Efficient covariance matrix factorization for various inverse problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified likelihood approximation with covariance correction
Avoids gradient propagation through diffusion model
Efficient covariance matrix factorization for inverse problems
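The second innovation, avoiding gradient propagation through the diffusion model, can be sketched as a reverse-sampling step that applies the Gaussian data-consistency correction to the network's clean-image estimate using only a forward pass. Everything here is an assumption for illustration: the `denoiser` interface, the `alpha_bar` schedule, the variance heuristic `(1 - alpha_bar_t) / alpha_bar_t`, and the DDIM-style update are stand-ins, not the paper's exact formulation.

```python
import numpy as np

def posterior_step(x_t, t, denoiser, y, A, sigma_y, alpha_bar):
    """One gradient-free reverse step with Gaussian data consistency.

    Sketch under stated assumptions; denoiser(x_t, t) is a hypothetical
    network returning an estimate of the clean image.
    """
    ab_t = alpha_bar[t]
    ab_prev = alpha_bar[t - 1] if t > 0 else 1.0
    # Clean-image estimate from a single forward pass (no autograd needed).
    x0_hat = denoiser(x_t, t)
    # Heuristic variance of x0 given x_t for the Gaussian likelihood.
    sig2 = (1.0 - ab_t) / ab_t
    C = sigma_y**2 * np.eye(A.shape[0]) + sig2 * (A @ A.T)
    # Data consistency via a linear solve, not backprop through the model.
    z = np.linalg.solve(C, y - A @ x0_hat)
    x0_corr = x0_hat + sig2 * (A.T @ z)
    # Deterministic DDIM-style move to the previous noise level.
    eps_hat = (x_t - np.sqrt(ab_t) * x0_corr) / np.sqrt(1.0 - ab_t)
    return np.sqrt(ab_prev) * x0_corr + np.sqrt(1.0 - ab_prev) * eps_hat
```

Because the correction acts on `x0_hat` through a linear solve, each step costs one denoiser forward pass plus one factorization, which is the source of the speedup over methods that differentiate through the network.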
Authors
Nebiyou Yismaw, University of California Riverside
U. Kamilov, Washington University in St. Louis
M. S. Asif, University of California Riverside