🤖 AI Summary
This work investigates the convergence properties and implicit optimization objectives of Plug-and-Play (PnP) methods when analysis or synthesis Gaussian denoisers replace the proximity operator in forward-backward (FB) algorithms. Within a dictionary framework, the authors establish that FB-PnP yields the same solution whether the internal denoising sub-iterations are performed once or run to convergence. Specifically, the synthesis denoising problem realizes an exact proximity operator, while the analysis denoiser, under a warm-restart strategy, makes FB-PnP equivalent to a primal-dual algorithm. Convergence of FB-PnP is thus shown to be independent of the number of inner denoising iterations, and a complementary analysis via Moreau-Yosida smoothing of the global problem covers an arbitrary number of sub-iterations. Experiments on a compressed sensing toy problem and on image restoration with a deep dictionary prior demonstrate both high reconstruction accuracy and strong algorithmic stability.
📝 Abstract
In this work we study the behavior of the forward-backward (FB) algorithm when the proximity operator is replaced by a sub-iterative procedure approximating a Gaussian denoiser, in a Plug-and-Play (PnP) fashion. In particular, we consider both analysis and synthesis Gaussian denoisers within a dictionary framework, obtained by unrolling dual-FB iterations or FB iterations, respectively. We analyze the associated minimization problems as well as the asymptotic behavior of the resulting FB-PnP iterations. In particular, we show that the synthesis Gaussian denoising problem can be viewed as a proximity operator. For each case, analysis and synthesis, we show that the FB-PnP algorithm solves the same problem whether we use only one or an infinite number of sub-iterations to solve the denoising problem at each iteration. To this end, we show that each "one sub-iteration" strategy within the FB-PnP can be interpreted as a primal-dual algorithm when a warm-restart strategy is used. We further present similar results when using a Moreau-Yosida smoothing of the global problem, for an arbitrary number of sub-iterations. Finally, we provide numerical simulations to illustrate our theoretical results, considering both a toy compressive sensing example and an image restoration problem in a deep dictionary framework.
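The synthesis case described above can be sketched as a small numerical experiment. This is a minimal illustration under made-up toy choices (the dictionary `D`, sensing matrix `A`, regularization weight, and step sizes below are assumptions, not the paper's setup): an FB-PnP outer loop whose denoising step solves the synthesis problem min_z ½‖u − Dz‖² + γλ‖z‖₁ by unrolled FB (ISTA) steps, with the sparse code warm-restarted across outer iterations so that one sub-iteration per outer step can be compared against many.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def synthesis_denoiser(u, D, lam, z0, n_sub):
    """Approximate the synthesis Gaussian denoiser of u by n_sub ISTA steps
    on min_z 0.5 * ||u - D z||^2 + lam * ||z||_1, starting from z0."""
    z = z0.copy()
    step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_sub):
        z = soft_threshold(z + step * D.T @ (u - D @ z), step * lam)
    return D @ z, z

def fb_pnp(y, A, D, lam, n_outer, n_sub):
    """FB-PnP for 0.5 * ||A x - y||^2 + synthesis l1 prior, with the denoising
    sub-problem warm-restarted across outer iterations."""
    x = np.zeros(A.shape[1])
    z = np.zeros(D.shape[1])  # warm-restarted sparse code
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # FB step size
    for _ in range(n_outer):
        u = x - gamma * A.T @ (A @ x - y)        # gradient step on the data term
        x, z = synthesis_denoiser(u, D, gamma * lam, z, n_sub)  # PnP denoising step
    return x

if __name__ == "__main__":
    # Toy compressive-sensing-style problem (all sizes and values are illustrative).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((15, 30)) / np.sqrt(15)
    D = rng.standard_normal((30, 50))
    D /= np.linalg.norm(D, 2)
    z_true = np.zeros(50)
    z_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
    y = A @ (D @ z_true)
    # One sub-iteration (warm-restarted) vs. many sub-iterations per outer step:
    x_one = fb_pnp(y, A, D, lam=0.01, n_outer=800, n_sub=1)
    x_many = fb_pnp(y, A, D, lam=0.01, n_outer=800, n_sub=50)
    print(np.linalg.norm(x_one - x_many) / np.linalg.norm(x_many))
```

On this toy instance the two variants reach nearly the same reconstruction, which is the behavior the paper establishes: with warm restarts, the number of denoising sub-iterations does not change the problem FB-PnP solves.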