Analysis and Synthesis Denoisers for Forward-Backward Plug-and-Play Algorithms

📅 2024-11-20
🏛️ arXiv.org
📈 Citations: 2
Influential: 0
🤖 AI Summary
This work investigates the convergence properties and implicit optimization objectives of Plug-and-Play (PnP) methods when analysis or synthesis Gaussian denoisers replace the proximal operator in forward-backward (FB) algorithms. Within a dictionary learning framework, the authors establish that FB-PnP yields identical solutions whether the internal denoising sub-iterations are performed once or run to convergence. Specifically, synthesis denoisers realize exact proximal mappings, while analysis denoisers, under a warm-restart strategy, become equivalent to primal-dual algorithms. The convergence of FB-PnP is shown to be independent of the number of inner denoising iterations; a unified analysis is provided via Moreau-Yosida regularization and a dual-domain FB expansion. Experiments on compressed sensing and deep dictionary-based image restoration demonstrate both high reconstruction accuracy and strong algorithmic stability.

📝 Abstract
In this work we study the behavior of the forward-backward (FB) algorithm when the proximity operator is replaced by a sub-iterative procedure that approximates a Gaussian denoiser, in a Plug-and-Play (PnP) fashion. In particular, we consider both analysis and synthesis Gaussian denoisers within a dictionary framework, obtained by unrolling dual-FB iterations or FB iterations, respectively. We analyze the associated minimization problems as well as the asymptotic behavior of the resulting FB-PnP iterations. In particular, we show that the synthesis Gaussian denoising problem can be viewed as a proximity operator. For each case, analysis and synthesis, we show that the FB-PnP algorithm solves the same problem whether we use only one or an infinite number of sub-iterations to solve the denoising problem at each iteration. To this end, we show that each "one sub-iteration" strategy within the FB-PnP can be interpreted as a primal-dual algorithm when a warm-restart strategy is used. We further present similar results when using a Moreau-Yosida smoothing of the global problem, for an arbitrary number of sub-iterations. Finally, we provide numerical simulations to illustrate our theoretical results: we first consider a toy compressive sensing example, and then an image restoration problem in a deep dictionary framework.
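The synthesis case described in the abstract can be sketched concretely: the "denoiser" solves a sparse coding problem over a dictionary D by a few unrolled FB (ISTA) steps, and the outer FB-PnP loop alternates a gradient step on the data term with this sub-iterative denoiser, warm-restarting the inner code across outer iterations. The sketch below is illustrative only (function names, step-size choices, and the least-squares data term are assumptions, not the paper's exact scheme).

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def synthesis_denoiser(u, D, lam, n_inner, z0=None):
    """Approximate the synthesis Gaussian denoiser
        argmin_z 0.5 * ||u - D z||^2 + lam * ||z||_1
    by n_inner unrolled FB (ISTA) steps. Returns D z and z,
    so the caller can warm-restart the inner code z."""
    z = np.zeros(D.shape[1]) if z0 is None else z0
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    for _ in range(n_inner):
        z = soft_threshold(z - (1.0 / L) * D.T @ (D @ z - u), lam / L)
    return D @ z, z

def fb_pnp(y, A, D, lam, n_iter=200, n_inner=1):
    """FB-PnP for 0.5 * ||A x - y||^2 regularized through a synthesis
    denoiser standing in for the proximity operator."""
    tau = 1.0 / np.linalg.norm(A, 2) ** 2  # step size <= 1 / Lip(grad of data term)
    x = np.zeros(A.shape[1])
    z = None                               # warm-restarted inner variable
    for _ in range(n_iter):
        u = x - tau * A.T @ (A @ x - y)            # forward (gradient) step
        x, z = synthesis_denoiser(u, D, tau * lam, n_inner, z0=z)  # backward step
    return x
```

With D equal to the identity, the inner problem reduces to plain soft-thresholding, so the loop recovers ISTA for the LASSO; the point of the paper's analysis is that, thanks to the warm restart, `n_inner=1` targets the same solution as `n_inner` tending to infinity.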
Problem

Research questions and friction points this paper is trying to address.

Analyzing Plug-and-Play algorithms with Gaussian denoisers
Studying convergence of forward-backward iterations with sub-iterations
Comparing analysis versus synthesis denoising in dictionary frameworks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Forward-backward Plug-and-Play algorithm
Analysis and synthesis Gaussian denoisers
Primal-dual interpretation with warm-restart
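The third innovation, the primal-dual reading of the warm-restart strategy, can be sketched for the analysis case: the analysis denoiser is computed by dual-FB steps, and keeping the dual variable across outer FB-PnP iterations turns the "one sub-iteration" scheme into a primal-dual algorithm. The code below is a minimal illustration under assumed names and step sizes, not the paper's exact algorithm.

```python
import numpy as np

def analysis_denoiser_step(v, w, D, lam, sigma):
    """One dual-FB step toward argmin_u 0.5 * ||u - v||^2 + lam * ||D u||_1.
    w is the dual variable; reusing it across outer iterations (warm restart)
    is what makes the overall scheme a primal-dual algorithm."""
    u = v - D.T @ w                   # primal variable recovered from the dual
    w = w + sigma * D @ u             # dual gradient ascent step
    w = np.clip(w, -lam, lam)         # projection onto the l-infinity ball
    u = v - D.T @ w
    return u, w

def fb_pnp_analysis(y, A, D, lam, n_iter=300):
    """FB-PnP with a single analysis-denoiser sub-iteration per outer step."""
    tau = 1.0 / np.linalg.norm(A, 2) ** 2
    sigma = 1.0 / np.linalg.norm(D, 2) ** 2
    x = np.zeros(A.shape[1])
    w = np.zeros(D.shape[0])          # warm-restarted dual variable
    for _ in range(n_iter):
        v = x - tau * A.T @ (A @ x - y)               # forward step on data term
        x, w = analysis_denoiser_step(v, w, D, tau * lam, sigma)
    return x
```

When D is the identity, analysis and synthesis regularization coincide, and by Moreau decomposition the dual fixed point again yields soft-thresholding, which makes the scheme easy to sanity-check.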
👥 Authors
Matthieu Kowalski
Universite Paris-Saclay
Signal Processing · Inverse Problems · Machine Learning
Benoit Malézieux
Inria, Université Paris-Saclay, CEA, Palaiseau, France
Thomas Moreau
Inria, CEA, Université Paris-Saclay
Machine Learning · Time Series · Bi-level Optimization · Convolutional Dictionary Learning
Audrey Repetti
School of Mathematics and Computer Sciences and School of Engineering and Physical Sciences, Heriot-Watt University, Edinburgh, UK; Maxwell Institute for Mathematical Sciences, Edinburgh, UK