MAP Estimation with Denoisers: Convergence Rates and Guarantees

📅 2025-07-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Maximum a posteriori (MAP) estimation in inverse problems often replaces the analytically intractable proximal operator with a pre-trained denoiser, a widely adopted yet theoretically ungrounded heuristic. Method: We propose a structurally simple, algorithmically faithful framework for MAP inference. Under the assumption that the prior is log-concave, we prove global convergence of the proposed algorithm and establish its equivalence to gradient descent on a smoothed proximal objective. Contribution/Results: This work provides the first rigorous convergence guarantee for denoiser-based proximal operator approximation, bridging empirical denoising priors (e.g., DnCNN, DDPM) with classical optimization paradigms at the theoretical level. It establishes a sound optimization foundation for mainstream methodologies, including Denoising-based Compressed Sensing (DnC) and score-matching-driven inverse problem solving.

📝 Abstract
Denoiser models have become powerful tools for inverse problems, enabling the use of pretrained networks to approximate the score of a smoothed prior distribution. These models are often used in heuristic iterative schemes aimed at solving Maximum a Posteriori (MAP) optimisation problems, where the proximal operator of the negative log-prior plays a central role. In practice, this operator is intractable, and practitioners plug in a pretrained denoiser as a surrogate, despite the lack of general theoretical justification for this substitution. In this work, we show that a simple algorithm, closely related to several used in practice, provably converges to the proximal operator under a log-concavity assumption on the prior $p$. We show that this algorithm can be interpreted as a gradient descent on smoothed proximal objectives. Our analysis thus provides a theoretical foundation for a class of empirically successful but previously heuristic methods.
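The substitution the abstract describes can be made concrete with a generic plug-and-play proximal-gradient loop for the MAP objective min_x 0.5*||Ax - y||^2 - log p(x). The sketch below is illustrative, not the paper's algorithm: the function name, initialization, and fixed step size are assumptions.

```python
import numpy as np

def pnp_proximal_gradient(y, A, denoiser, step=1.0, iters=100):
    """Plug-and-play proximal gradient for min_x 0.5*||A x - y||^2 - log p(x).

    The exact proximal operator of the negative log-prior is intractable,
    so a pretrained denoiser is plugged in as a surrogate (the heuristic
    whose convergence the paper studies).
    """
    x = A.T @ y  # simple (illustrative) initialization
    for _ in range(iters):
        grad = A.T @ (A @ x - y)       # gradient of the data-fidelity term
        x = denoiser(x - step * grad)  # denoiser stands in for prox of -log p
    return x
```

With an identity forward operator and an identity "denoiser" the loop reduces to plain gradient descent on the data term, which is a quick sanity check that the iteration is wired correctly.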
Problem

Research questions and friction points this paper is trying to address.

Theoretical justification for using denoisers in MAP estimation
Convergence analysis of algorithms replacing proximal operators with denoisers
Establishing gradient descent interpretation for empirical denoising-based methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Denoisers approximate score of smoothed prior
Algorithm converges to proximal operator
Gradient descent on smoothed proximal objectives
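The first bullet is Tweedie's identity, D_sigma(x) = x + sigma^2 * grad log p_sigma(x): an MMSE denoiser at noise level sigma encodes the score of the sigma-smoothed prior. The sketch below checks the identity for a 1-D Gaussian prior, where both sides have closed forms; the values of tau and sigma are arbitrary illustrative choices.

```python
import numpy as np

# For a Gaussian prior p = N(0, tau^2), the sigma-smoothed prior is
# p_sigma = N(0, tau^2 + sigma^2), so its score is -x / (tau^2 + sigma^2),
# and the MMSE denoiser has the closed form D(x) = tau^2/(tau^2+sigma^2) * x.
tau, sigma = 2.0, 0.5
x = np.linspace(-3.0, 3.0, 7)

denoised = tau**2 / (tau**2 + sigma**2) * x  # closed-form MMSE denoiser
score = -x / (tau**2 + sigma**2)             # score of the smoothed prior
tweedie = x + sigma**2 * score               # Tweedie: D(x) = x + sigma^2 * score

assert np.allclose(denoised, tweedie)
```

The same identity is what lets a pretrained denoiser serve as a score estimate inside the iterative schemes the paper analyzes.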