🤖 AI Summary
Traditional convex regularizers (e.g., total variation, TV) have limited capacity to model sparsity in image denoising, often leading to oversmoothing. Method: This paper proposes a non-convex graph total variation (NC-GTV) regularizer. It constructs a non-convex penalty via a graph-based Huber function and employs the Gershgorin Circle Theorem to efficiently compute a convexity-guaranteeing parameter, ensuring the joint objective (ℓ₂ data fidelity + NC-GTV) is convex and free of extraneous local minima. The ADMM optimization is then unrolled into a lightweight, learnable network. Contribution/Results: Experiments show that the proposed method outperforms unrolled GTV and other representative denoising algorithms on standard benchmarks. It reduces model parameters by over 40%, achieves faster inference, and exhibits stable convergence.
📝 Abstract
Conventional model-based image denoising optimizations employ convex regularization terms, such as total variation (TV), which convexifies the $\ell_0$-norm to promote sparse signal representation. Instead, we propose a new non-convex total variation term in a graph setting (NC-GTV) such that, when combined with an $\ell_2$-norm fidelity term for denoising, it leads to a convex objective with no extraneous local minima. We define NC-GTV using a new graph variant of the Huber function, interpretable as a Moreau envelope. The crux is the selection of a parameter $a$ characterizing the graph Huber function that ensures overall objective convexity; we efficiently compute $a$ via an adaptation of the Gershgorin Circle Theorem (GCT). To minimize the convex objective, we design a linear-time algorithm based on the Alternating Direction Method of Multipliers (ADMM) and unroll it into a lightweight feed-forward network for data-driven parameter learning. Experiments show that our method outperforms unrolled GTV and other representative image denoising schemes, while employing far fewer network parameters.
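The two ingredients named above can be illustrated with a minimal NumPy sketch: the scalar Huber function written as the Moreau envelope of $|x|$, and a Gershgorin-style bound used to pick a convexity-safe parameter $a$. This is not the paper's exact construction — the function names (`huber`, `gershgorin_max_eig_bound`, `safe_huber_parameter`), the toy path-graph Laplacian, and the simplified convexity condition $I - aL \succeq 0$ are illustrative assumptions.

```python
import numpy as np

def huber(x, a):
    """Moreau envelope of |x| with smoothing parameter a:
        min_z |z| + (x - z)^2 / (2a)
    which evaluates to x^2/(2a) for |x| <= a, and |x| - a/2 otherwise."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= a, x**2 / (2.0 * a), np.abs(x) - a / 2.0)

def gershgorin_max_eig_bound(M):
    """Gershgorin Circle Theorem: every eigenvalue of M lies in some disc
    centered at M[i,i] with radius sum_{j != i} |M[i,j]|, so the largest
    disc edge upper-bounds the largest eigenvalue."""
    d = np.diag(M)
    radii = np.abs(M).sum(axis=1) - np.abs(d)
    return np.max(d + radii)

def safe_huber_parameter(L):
    """Pick a <= 1 / lambda_max(L), so that I - a*L is positive
    semi-definite; in this simplified setting that keeps the
    l2-fidelity-plus-non-convex-penalty objective convex overall.
    GCT gives a cheap O(n^2) bound without an eigendecomposition."""
    return 1.0 / gershgorin_max_eig_bound(L)

# Toy example: combinatorial Laplacian of a 3-node path graph.
L = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
a = safe_huber_parameter(L)  # guaranteed convexity-safe choice
```

For this Laplacian, $\lambda_{\max}(L) = 3$ while the Gershgorin bound is $4$, so `a = 0.25`: slightly conservative, but obtained in linear time in the number of graph edges, which is what makes the bound attractive inside an unrolled network.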