🤖 AI Summary
This work addresses the limitations of existing deep unfolding networks for image restoration, which suffer from uniform denoising objectives across stages, high parameter redundancy, and substantial memory consumption. To overcome these issues, the authors propose LoRun, a novel framework that introduces low-rank adaptation (LoRA) into deep unfolding architectures for the first time. LoRun shares a pre-trained base denoiser across all unfolding stages while incorporating lightweight LoRA adapters to enable stage-adaptive modulation of denoising strength. This design significantly reduces model redundancy and achieves performance comparable to or better than baseline methods across three image restoration tasks. Moreover, it attains up to an N-fold reduction in trainable parameters—where N denotes the number of unfolding stages—thereby offering an efficient yet effective solution that balances computational economy with restoration quality.
📝 Abstract
Deep unfolding networks (DUNs), which combine conventional iterative optimization algorithms and deep neural networks into a multi-stage framework, have achieved remarkable results in Image Restoration (IR) tasks such as spectral imaging reconstruction, compressive sensing, and super-resolution. A DUN unfolds the iterative optimization steps into a stack of sequentially linked blocks. Each block consists of a Gradient Descent Module (GDM) and a Proximal Mapping Module (PMM); from a Bayesian perspective, the PMM is equivalent to a denoiser operating on Gaussian noise with a known level. However, existing DUNs suffer from two critical limitations: (i) their PMMs share identical architectures and denoising objectives across stages, ignoring the need for stage-specific adaptation to varying noise levels; and (ii) their chain of structurally repetitive blocks results in severe parameter redundancy and high memory consumption, hindering deployment in large-scale or resource-constrained scenarios. To address these challenges, we introduce generalized Deep Low-Rank Adaptation (LoRA) Unfolding Networks for image restoration, named LoRun, which harmonize denoising objectives and adapt the denoising level across stages while compressing memory usage for a more efficient DUN. LoRun introduces a novel paradigm in which a single pretrained base denoiser is shared across all stages, while lightweight, stage-specific LoRA adapters are injected into the PMMs to dynamically modulate denoising behavior according to the noise level at each unfolding step. This design decouples the core restoration capability from task-specific adaptation, enabling precise control over denoising intensity without duplicating full network parameters and achieving up to an $N$-fold reduction in trainable parameters for an $N$-stage DUN with on-par or better performance. Extensive experiments on three IR tasks validate the efficiency of our method.
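To make the parameter-sharing idea concrete, here is a minimal sketch of the stage-adaptive mechanism the abstract describes. All names (`pmm`, `stages`, the dimensions) are illustrative assumptions, not the paper's actual implementation: a single frozen base weight matrix `W` stands in for the shared pretrained denoiser, and each unfolding stage contributes only a low-rank pair `(B_k, A_k)`, so the per-stage effective weight is `W + B_k @ A_k`.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, N = 8, 2, 5   # feature dim, LoRA rank, number of unfolding stages (toy sizes)

# Shared pretrained "base denoiser" weights: frozen, stored once for all stages.
W = rng.standard_normal((d, d))

# Stage-specific LoRA adapters: only these small factors would be trained.
stages = [(0.01 * rng.standard_normal((d, r)),   # B_k
           0.01 * rng.standard_normal((r, d)))   # A_k
          for _ in range(N)]

def pmm(x, k):
    """Proximal-mapping step at stage k: shared base plus stage-k low-rank update."""
    B, A = stages[k]
    return (W + B @ A) @ x

# Unrolled iteration: the same base denoiser, modulated per stage by its adapter.
x = rng.standard_normal(d)
for k in range(N):
    x = pmm(x, k)

# Trainable-parameter comparison: N low-rank pairs vs. N full copies of W.
lora_params = N * (d * r + r * d)
full_params = N * d * d
print(lora_params, full_params)
```

With these toy sizes the adapters hold 160 trainable parameters versus 320 for N full weight copies; since the low-rank cost grows as `2*d*r` rather than `d*d`, the savings approach the N-fold reduction claimed in the abstract as `d` grows relative to `r`.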