A primal-dual algorithm for image reconstruction with ICNNs

📅 2024-10-16
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Optimization of ICNN-regularized variational image reconstruction is hindered by nonsmoothness and the nested structure inherent in ICNNs, which blocks standard proximal techniques. Method: The paper reformulates the problem to eliminate the network's nested structure, relating the reformulation to epigraphical projections of the activation functions. This yields a convex optimization problem that is proven exactly equivalent to the original variational problem, and an efficient primal-dual solver is developed for it. Results: Experiments across multiple image reconstruction tasks show that the proposed method outperforms conventional subgradient methods, with markedly faster convergence and improved iteration stability.
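The "nested structure" referred to here is the layerwise recursion of an ICNN, in which each layer's weight on the previous activation is constrained to be nonnegative so that the scalar output is convex in the input. A minimal sketch (layer sizes and weight initialization are illustrative, not taken from the paper):

```python
import numpy as np

def icnn(x, Wz, Wx, b):
    """Scalar-output input-convex neural network (ICNN).

    Recursion: z_{k+1} = relu(Wz[k] @ z_k + Wx[k+1] @ x + b[k+1]),
    with every Wz[k] elementwise nonnegative, so the output is convex in x
    (affine maps composed with convex, nondecreasing ReLU stay convex).
    """
    z = np.maximum(Wx[0] @ x + b[0], 0.0)  # first layer: no z-term yet
    for k in range(len(Wz)):
        z = np.maximum(Wz[k] @ z + Wx[k + 1] @ x + b[k + 1], 0.0)
    return float(z[0])

# Numerical convexity check at a random midpoint (Wz kept nonnegative via abs):
rng = np.random.default_rng(0)
Wx = [rng.standard_normal((4, 3)), rng.standard_normal((4, 3)),
      rng.standard_normal((1, 3))]
Wz = [np.abs(rng.standard_normal((4, 4))), np.abs(rng.standard_normal((1, 4)))]
b = [rng.standard_normal(4), rng.standard_normal(4), rng.standard_normal(1)]
x1, x2 = rng.standard_normal(3), rng.standard_normal(3)
assert icnn(0.5 * (x1 + x2), Wz, Wx, b) <= \
    0.5 * (icnn(x1, Wz, Wx, b) + icnn(x2, Wz, Wx, b)) + 1e-9
```

The nonnegativity constraint on `Wz` is exactly what makes the regularizer convex in the image, yet the ReLU-inside-ReLU composition is what prevents a direct proximal step on the network output.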

📝 Abstract
We address the optimization problem in a data-driven variational reconstruction framework, where the regularizer is parameterized by an input-convex neural network (ICNN). While gradient-based methods are commonly used to solve such problems, they struggle to effectively handle non-smoothness, which often leads to slow convergence. Moreover, the nested structure of the neural network complicates the application of standard non-smooth optimization techniques, such as proximal algorithms. To overcome these challenges, we reformulate the problem and eliminate the network's nested structure. By relating this reformulation to epigraphical projections of the activation functions, we transform the problem into a convex optimization problem that can be efficiently solved using a primal-dual algorithm. We also prove that this reformulation is equivalent to the original variational problem. Through experiments on several imaging tasks, we demonstrate that the proposed approach outperforms subgradient methods in terms of both speed and stability.
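To make the primal-dual machinery concrete, here is a Chambolle-Pock (PDHG) iteration on a simple model problem, min_x 0.5·||Ax − y||² + λ·||x||₁. This is a stand-in sketch of the algorithmic building block, not the paper's ICNN reformulation; step sizes and the dualization of the data term are standard choices:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdhg(A, y, lam, n_iter=2000):
    """Chambolle-Pock iteration for min_x 0.5*||A x - y||^2 + lam*||x||_1.

    The data term is dualized: with F(z) = 0.5*||z - y||^2, the conjugate is
    F*(p) = <p, y> + 0.5*||p||^2, whose prox is q -> (q - sigma*y)/(1 + sigma).
    """
    L = np.linalg.norm(A, 2)        # operator norm of K = A
    tau = sigma = 0.99 / L          # step sizes satisfying tau*sigma*L^2 < 1
    x = np.zeros(A.shape[1])
    xbar = x.copy()
    p = np.zeros(A.shape[0])
    for _ in range(n_iter):
        p = (p + sigma * (A @ xbar - y)) / (1.0 + sigma)        # dual prox
        x_new = soft_threshold(x - tau * (A.T @ p), tau * lam)  # primal prox
        xbar = 2.0 * x_new - x                                  # extrapolation
        x = x_new
    return x

# With A = I the minimizer is the soft-thresholded data:
x = pdhg(np.eye(2), np.array([3.0, -0.5]), lam=1.0)
assert np.allclose(x, [2.0, 0.0], atol=1e-3)
```

The paper's contribution is, in effect, a reformulation that lets an ICNN regularizer take the place of the simple ℓ1 term here while keeping every subproblem proximable.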
Problem

Research questions and friction points this paper is trying to address.

Optimizing image reconstruction with non-smooth neural network regularizers
Overcoming slow convergence of gradient methods in variational frameworks
Handling nested network structure complicating standard optimization techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reformulating problem to eliminate nested network structure
Using epigraphical projections for convex optimization transformation
Employing primal-dual algorithm for efficient reconstruction solution
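The epigraphical projections mentioned above have closed forms for piecewise-linear activations. For ReLU, the epigraph is {(x, t) : t ≥ max(x, 0)}, and projecting onto it reduces to a three-case formula. A 1-D sketch (the paper's construction and notation may differ):

```python
def proj_epi_relu(a, b):
    """Euclidean projection of (a, b) onto epi(relu) = {(x, t): t >= max(x, 0)}."""
    if b >= max(a, 0.0):
        return a, b                 # already inside the epigraph
    if a <= 0.0:
        return a, 0.0               # project onto the flat boundary t = 0
    s = 0.5 * (a + b)               # projection onto the slanted boundary t = x
    return (s, s) if s > 0.0 else (0.0, 0.0)  # fall back to the corner
```

Because each activation contributes such a cheap exact projection, the reformulated problem stays amenable to proximal splitting even though the original nested objective was not.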
Hok Shing Wong
Department of Mathematical Sciences, University of Bath, Bath BA2 7JU, UK
Matthias Joachim Ehrhardt
Department of Mathematical Sciences, University of Bath, Bath BA2 7JU, UK
Subhadip Mukherjee
Assistant Professor, Department of E&ECE, IIT Kharagpur, India
Machine Learning · Inverse Problems in Imaging · Optimization