A regret minimization approach to fixed-point iterations

📅 2025-09-25
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses fixed-point computation for non-self-mappings. It systematically transforms online regret minimization algorithms, notably Online Gradient Descent (OGD) and AdaGrad-type methods, into fixed-point iteration schemes with provable convergence guarantees. The proposed framework generalizes the classical Krasnoselskii–Mann iteration and, for the first time, brings regret analysis from online optimization to bear on adaptive fixed-point updates in the non-self-mapping setting. Theoretically, under standard assumptions (e.g., cocoercivity or monotonicity), the methods achieve strong convergence with explicit iteration complexity bounds. Empirically, the AdaGrad-based adaptive scheme significantly outperforms the standard Krasnoselskii–Mann method on challenging nonsmooth and nonmonotone fixed-point problems, including variational inequalities and solving implicit deep networks, demonstrating both faster convergence and improved robustness.

📝 Abstract
We propose a conversion scheme that turns regret-minimizing algorithms into fixed-point iterations, with convergence guarantees following from regret bounds. The resulting iterations can be seen as a grand extension of the classical Krasnoselskii–Mann iterations, as the latter are recovered by converting the Online Gradient Descent algorithm. This approach yields new simple iterations for finding fixed points of non-self operators. We also focus on converting algorithms from the AdaGrad family of regret minimizers, and thus obtain fixed-point iterations with adaptive guarantees of a new kind. Numerical experiments on various problems demonstrate faster convergence of AdaGrad-based fixed-point iterations over Krasnoselskii–Mann iterations.
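For context, the classical Krasnoselskii–Mann iteration that the abstract refers to averages the current iterate with its image under the operator, x_{k+1} = (1 − α) x_k + α T(x_k). A minimal sketch (the operator `T` and step `alpha` below are illustrative, not taken from the paper):

```python
import numpy as np

def krasnoselskii_mann(T, x0, alpha=0.5, n_iters=100):
    """Krasnoselskii-Mann iteration: x_{k+1} = (1 - alpha) * x_k + alpha * T(x_k).

    Converges to a fixed point of T when T is nonexpansive and alpha is in (0, 1).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = (1 - alpha) * x + alpha * T(x)
    return x

# Toy example: T(x) = 0.5*x + 1 is a contraction with fixed point x = 2.
T = lambda x: 0.5 * x + 1.0
print(krasnoselskii_mann(T, np.array([0.0])))  # converges to ~2.0
```

Per the abstract, this scheme is what the paper's conversion recovers when the underlying regret minimizer is Online Gradient Descent.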
Problem

Research questions and friction points this paper is trying to address.

How can regret minimization algorithms be converted into fixed-point iterations with convergence guarantees?
How do the resulting iterations relate to the classical Krasnoselskii–Mann iteration, which corresponds to converting Online Gradient Descent?
Can AdaGrad-type regret minimizers yield adaptive fixed-point iterations with faster convergence?
Innovation

Methods, ideas, or system contributions that make the work stand out.

A general conversion scheme that turns regret-minimizing algorithms into fixed-point iterations, with convergence guarantees derived from regret bounds
Recovery and extension of Krasnoselskii–Mann iterations by converting Online Gradient Descent, including new simple iterations for non-self operators
Adaptive fixed-point iterations obtained from the AdaGrad family, with adaptive guarantees of a new kind
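The paper's exact adaptive update is not reproduced in this summary. As a rough illustration of the idea, the hypothetical AdaGrad-style sketch below treats the fixed-point residual x − T(x) as a gradient and scales the step by accumulated squared residuals; the function name and parameters (`eta`, `eps`) are illustrative assumptions, not the paper's scheme:

```python
import numpy as np

def adagrad_fixed_point(T, x0, eta=1.0, eps=1e-8, n_iters=200):
    """Hypothetical AdaGrad-style fixed-point iteration (illustrative sketch).

    The residual g = x - T(x) plays the role of a gradient; steps are scaled
    coordinate-wise by the accumulated squared residuals, as in AdaGrad.
    """
    x = np.asarray(x0, dtype=float)
    G = np.zeros_like(x)  # accumulator of squared residuals
    for _ in range(n_iters):
        g = x - T(x)                       # fixed-point residual
        G += g * g                         # AdaGrad accumulator
        x = x - eta * g / (np.sqrt(G) + eps)  # adaptively scaled step
    return x

# Toy example: same contraction as before, fixed point x = 2.
T = lambda x: 0.5 * x + 1.0
print(adagrad_fixed_point(T, np.array([0.0])))  # converges to ~2.0
```

The appeal of such adaptive scaling, per the summary above, is that step sizes adjust automatically to the problem, which the paper reports as faster and more robust than fixed-step Krasnoselskii–Mann iterations on nonsmooth and nonmonotone problems.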