Learning to accelerate Krasnosel'skii-Mann fixed-point iterations with guarantees

📅 2026-01-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work proposes a learnable Krasnosel'skii-Mann iteration framework for solving fixed-point problems involving nonexpansive mappings, aiming to accelerate average-case convergence while preserving theoretical convergence guarantees. By injecting learnable perturbations that satisfy a summability condition, the method achieves locally linear convergence with a vanishing bias under a metric subregularity assumption. Presented as the first approach to integrate learning-to-optimize (L2O) principles into fixed-point iterations, the framework is compatible with any iterative scheme that exhibits rapid local convergence. It has been applied to operator splitting algorithms such as Douglas-Rachford, demonstrating significant acceleration on structured monotone inclusions and best approximation problems.
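The mechanism in the summary can be illustrated with a minimal sketch: a Krasnosel'skii-Mann iteration x_{k+1} = (1-λ)x_k + λT(x_k) plus a perturbation e_k with Σ‖e_k‖ < ∞. The operator T (a contractive affine map) and the geometrically decaying perturbation below are illustrative stand-ins chosen for this sketch; the paper's perturbations are learned, not hand-picked.

```python
import numpy as np

# Toy nonexpansive (here contractive) affine map T(x) = A x + b,
# a stand-in for the paper's general nonexpansive operator.
theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
A = 0.9 * R                                  # spectral norm 0.9, so T is nonexpansive
b = np.array([1.0, -2.0])
T = lambda x: A @ x + b
x_star = np.linalg.solve(np.eye(2) - A, b)   # exact fixed point of T

def km(x0, perturb=False, iters=100, lam=0.5, rho=0.5):
    """Krasnosel'skii-Mann iteration, optionally with summable perturbations."""
    x = x0.copy()
    errs = []
    for k in range(iters):
        step = (1 - lam) * x + lam * T(x)
        if perturb:
            # Hand-picked stand-in for a *learned* perturbation: a
            # geometrically decaying nudge along T(x) - x, which makes
            # sum_k ||e_k|| finite, so convergence guarantees survive.
            step = step + rho**k * (T(x) - x)
        x = step
        errs.append(np.linalg.norm(x - x_star))
    return errs

x0 = np.array([10.0, 10.0])
plain = km(x0)
pert = km(x0, perturb=True)
print(plain[-1], pert[-1])   # both converge to the fixed point
```

Because the perturbations are summable, the perturbed sequence inherits the convergence of the unperturbed KM iteration; any learned term respecting that condition cannot destroy the guarantee.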

📝 Abstract
We introduce a principled learning to optimize (L2O) framework for solving fixed-point problems involving general nonexpansive mappings. Our idea is to deliberately inject summable perturbations into a standard Krasnosel'skii-Mann iteration to improve its average-case performance over a specific distribution of problems while retaining its convergence guarantees. Under a metric sub-regularity assumption, we prove that the proposed parametrization includes only iterations that locally achieve linear convergence, up to a vanishing bias term, and that it encompasses all iterations that do so at a sufficiently fast rate. We then demonstrate how our framework can be used to augment several widely-used operator splitting methods to accelerate the solution of structured monotone inclusion problems, and validate our approach on a best approximation problem using an L2O-augmented Douglas-Rachford splitting algorithm.
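The abstract's validation uses an L2O-augmented Douglas-Rachford algorithm. The learned augmentation is not reproduced here; the sketch below only shows the baseline Douglas-Rachford operator, as a nonexpansive fixed-point map to which the KM iteration applies, on a hypothetical two-set feasibility problem (a line and a disk, both my own choices for illustration).

```python
import numpy as np

def proj_line(x):
    # Projection onto the line {x : x2 = 1} (illustrative set A).
    return np.array([x[0], 1.0])

def proj_disk(x, r=2.0):
    # Projection onto the disk of radius r centred at the origin (set B).
    n = np.linalg.norm(x)
    return x if n <= r else (r / n) * x

def dr_map(z):
    # Douglas-Rachford operator T(z) = z + P_B(2 P_A(z) - z) - P_A(z);
    # fixed points z* satisfy P_A(z*) in the intersection of A and B.
    pa = proj_line(z)
    return z + proj_disk(2 * pa - z) - pa

z = np.array([5.0, 5.0])
for _ in range(1000):
    # Plain KM step with relaxation 1/2 applied to the DR map.
    z = 0.5 * z + 0.5 * dr_map(z)

sol = proj_line(z)   # shadow sequence: approaches a point of the intersection
print(sol)
```

In the paper's framework, this is exactly the kind of iteration one would augment: add learned, summable perturbations to the KM update to speed up the average case without losing the fixed-point guarantee.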
Problem

Research questions and friction points this paper is trying to address.

fixed-point iteration
Krasnosel'skii-Mann
nonexpansive mappings
convergence guarantees
monotone inclusion
Innovation

Methods, ideas, or system contributions that make the work stand out.

learning to optimize
Krasnosel'skii-Mann iteration
nonexpansive mappings
metric sub-regularity
operator splitting
Andrea Martin
Division of Decision and Control Systems, KTH Royal Institute of Technology, and Digital Futures, SE-100 44 Stockholm, Sweden
Giuseppe Belgioioso
KTH Royal Institute of Technology
Optimization
Game Theory
Automatic Control