🤖 AI Summary
This work addresses nonlinear inverse problems in image processing by learning monotone neural operators with convergence guarantees. The inverse problem is formulated as a monotone variational inclusion, and a monotonicity-enforcing penalization loss is introduced so that the trained network, which approximates a possibly non-monotone operator, is itself monotone. Building on plug-and-play methodologies, inference is then carried out with the Forward-Backward-Forward (FBF) algorithm, a fixed-point iteration that converges for monotone operators even when the network's Lipschitz constant is unknown. The result pairs a provably monotone learned operator with a convergent iterative solver, and simulation examples demonstrate successful reconstruction on nonlinear inverse problems.
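The monotonicity constraint mentioned above can be enforced during training with a hinge-style penalty on sampled input pairs: a single-valued operator T is monotone when ⟨T(x) − T(y), x − y⟩ ≥ 0 for all x, y, so any negative inner product is penalized. The exact loss used in the paper is not reproduced here; the function below (the name `monotonicity_penalty` is illustrative) is a minimal NumPy sketch of this idea, assuming the operator is applied row-wise to a batch of sample pairs.

```python
import numpy as np

def monotonicity_penalty(T, x, y):
    """Hinge penalty on violations of the monotonicity condition
    <T(x) - T(y), x - y> >= 0, averaged over a batch of sample pairs
    (rows of x and y). The penalty is zero exactly when the operator
    behaves monotonically on all sampled pairs."""
    inner = np.sum((T(x) - T(y)) * (x - y), axis=-1)  # batched inner products
    return np.mean(np.maximum(0.0, -inner))           # penalize negative values only

# The identity is monotone, so it incurs no penalty; its negation
# reverses every inner product and is penalized on all pairs.
x = np.array([[1.0, 0.0], [0.0, 2.0]])
y = np.array([[0.0, 1.0], [1.0, 0.0]])
zero_pen = monotonicity_penalty(lambda z: z, x, y)   # 0.0
pos_pen = monotonicity_penalty(lambda z: -z, x, y)   # > 0
```

In a training loop this term would be added, with a weight, to the usual data-fidelity loss, nudging the network toward the monotone operators for which FBF's convergence guarantee applies.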
📝 Abstract
This article introduces a novel approach to learning monotone neural networks through a newly defined penalization loss. The proposed method is particularly effective for solving classes of variational problems, specifically monotone inclusion problems, commonly encountered in image processing tasks. The Forward-Backward-Forward (FBF) algorithm is employed to address these problems, offering a solution even when the Lipschitz constant of the neural network is unknown. Notably, the FBF algorithm provides convergence guarantees under the condition that the learned operator is monotone. Building on plug-and-play methodologies, our objective is to apply these newly learned operators to solving non-linear inverse problems. To achieve this, we first formulate the problem as a variational inclusion problem. We then train a monotone neural network to approximate an operator that may not inherently be monotone. Leveraging the FBF algorithm, we present simulation examples in which the non-linear inverse problem is successfully solved.
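The FBF (Tseng) iteration described above can be sketched concretely. For the simplest inclusion 0 ∈ B(x) with B monotone, one step is a forward step p = x − γB(x) followed by a correcting second forward step x⁺ = p − γ(B(p) − B(x)); when the Lipschitz constant of B is unknown, γ is chosen by backtracking until γ‖B(p) − B(x)‖ ≤ θ‖p − x‖. The code below (the name `fbf_solve` and the parameter values are illustrative, and the resolvent of the set-valued part is taken as the identity for simplicity) is a minimal NumPy sketch, not the paper's implementation:

```python
import numpy as np

def fbf_solve(B, x0, theta=0.9, gamma0=1.0, tol=1e-8, max_iter=10000):
    """Tseng's Forward-Backward-Forward iteration for finding a zero of a
    monotone operator B, with a backtracking step-size rule so that no
    Lipschitz constant needs to be known in advance."""
    x = np.asarray(x0, dtype=float)
    gamma = gamma0
    for _ in range(max_iter):
        Bx = B(x)
        if np.linalg.norm(Bx) < tol:
            break
        # Backtracking: shrink gamma until gamma*||B(p)-B(x)|| <= theta*||p-x||.
        while True:
            p = x - gamma * Bx                       # first forward step
            Bp = B(p)
            if (gamma * np.linalg.norm(Bp - Bx) <= theta * np.linalg.norm(p - x)
                    or np.allclose(p, x)):
                break
            gamma *= 0.5
        x = p - gamma * (Bp - Bx)                    # correcting forward step
    return x

# Example: a purely skew (hence monotone, but not strongly monotone) operator.
# A plain forward iteration x - gamma*B(x) spirals away from the solution here,
# while FBF contracts to the unique zero at the origin.
J = np.array([[0.0, 1.0], [-1.0, 0.0]])
x_star = fbf_solve(lambda x: J @ x, x0=np.array([1.0, 1.0]))
```

The skew example is the standard illustration of why the correction step matters: monotonicity alone is enough for FBF, whereas simpler forward schemes require stronger assumptions such as cocoercivity.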