🤖 AI Summary
This work addresses the implicit regularization structure and convergence properties of plug-and-play (PnP) methods that employ minimum-mean-squared-error (MMSE) denoisers. We establish, for the first time, that MMSE-PnP is equivalent to explicit optimization with a 1-weakly convex regularizer given by the upper Moreau envelope of the negative log-marginal density. Building on this insight, we derive the first non-asymptotic sublinear convergence rate for PnP proximal gradient descent, removing the restrictive asymptotic or strong-convexity assumptions required in prior analyses. Our approach combines weak convexity theory, proximal operator analysis, and implicit regularization modeling. We validate the theoretical convergence behavior and practical efficacy on synthetic 1D experiments and on real-world inverse problems, including image deblurring and CT reconstruction. Key contributions: (i) a precise characterization of the implicit regularizer induced by MMSE denoisers; (ii) the first non-asymptotic convergence guarantee for PnP proximal gradient descent; and (iii) tight alignment between theory and empirical performance.
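For orientation, a brief notational sketch (the symbols and conventions here are ours, not necessarily the paper's): Tweedie's identity expresses the MMSE denoiser under Gaussian noise of level $\sigma$ through the noisy marginal $p_\sigma$, weak convexity is defined by quadratic regularization, and PnP proximal gradient descent replaces the proximal step with the denoiser.

```latex
% MMSE denoiser via Tweedie's identity (standard fact; notation ours):
D_\sigma(y) \;=\; \mathbb{E}[x \mid y] \;=\; y + \sigma^2 \nabla \log p_\sigma(y)

% A function g is \rho-weakly convex (\rho = 1 in the paper's result)
% if the quadratically shifted function is convex:
x \;\mapsto\; g(x) + \tfrac{\rho}{2}\,\|x\|^2

% PnP proximal gradient descent with data-fidelity term f and step \gamma:
x_{k+1} \;=\; D_\sigma\!\bigl(x_k - \gamma\,\nabla f(x_k)\bigr)
```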
📝 Abstract
It is known that the minimum-mean-squared-error (MMSE) denoiser under Gaussian noise can be written as a proximal operator; this suffices for asymptotic convergence of plug-and-play (PnP) methods but reveals neither the structure of the induced regularizer nor a convergence rate. We show that the MMSE denoiser corresponds to a regularizer that can be written explicitly as an upper Moreau envelope of the negative log-marginal density, which in turn implies that the regularizer is 1-weakly convex. Using this property, we derive (to the best of our knowledge) the first sublinear convergence guarantee for PnP proximal gradient descent with an MMSE denoiser. We validate the theory with a one-dimensional synthetic study that recovers the implicit regularizer, and with imaging experiments (deblurring and computed tomography) that exhibit the predicted sublinear behavior.
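As a toy illustration of the scheme analyzed above (a minimal sketch under our own assumptions, not the paper's implementation): for a standard-normal prior in one dimension, the MMSE denoiser has a closed form, and the PnP proximal gradient iteration $x_{k+1} = D(x_k - \gamma \nabla f(x_k))$ takes a few lines.

```python
# Hypothetical 1D example (not from the paper): under a standard-normal
# prior x ~ N(0, 1) and Gaussian noise of variance sigma2, the MMSE
# denoiser is the posterior mean, D(y) = E[x | y] = y / (1 + sigma2).
def mmse_denoiser(y, sigma2=1.0):
    return y / (1.0 + sigma2)

# PnP proximal gradient descent for the quadratic data-fidelity term
# f(x) = 0.5 * (x - b)**2, whose gradient is x - b: the proximal step
# is replaced by the denoiser, x <- D(x - gamma * grad_f(x)).
def pnp_pgd(b, gamma=0.5, sigma2=1.0, steps=200):
    x = 0.0
    for _ in range(steps):
        x = mmse_denoiser(x - gamma * (x - b), sigma2)
    return x

# Here the iteration is an affine contraction, so it converges to the
# unique fixed point x* = gamma * b / (sigma2 + gamma).
print(pnp_pgd(2.0))  # -> 2/3 for b=2, gamma=0.5, sigma2=1
```

The quadratic data term and Gaussian prior are chosen only so the fixed point is computable by hand; the paper's rate concerns general weakly convex implicit regularizers, where no such closed form exists.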