A Class of Accelerated Fixed-Point-Based Methods with Delayed Inexact Oracles and Its Applications

📅 2025-12-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the efficient computation of fixed points of nonexpansive operators (equivalently, zeros of co-coercive operators) under delayed and inexact oracle evaluations. We propose a novel framework that integrates Nesterov acceleration with the Krasnosel’skii–Mann iteration, and introduce the first unified error model for delayed inexact oracles—accommodating asynchronous, stochastic, and deterministic updates. Theoretically, we establish non-asymptotic $O(1/k^2)$ convergence and asymptotic (or almost-sure) $o(1/k^2)$ convergence rates for the squared residual norm—significantly improving upon the $O(1/k)$ rate of classical KM-type methods. Moreover, the iteration complexity scales linearly with the maximum delay. We empirically validate the efficacy and practicality of our approach on matrix games and shallow neural network training.

📝 Abstract
In this paper, we develop a novel accelerated fixed-point-based framework using delayed inexact oracles to approximate a fixed point of a nonexpansive operator (or, equivalently, a root of a co-coercive operator), a central problem in scientific computing. Our approach leverages both Nesterov's acceleration technique and the Krasnosel'skii-Mann (KM) iteration, while accounting for delayed inexact oracles, a key mechanism in asynchronous algorithms. We also introduce a unified approximate error condition for delayed inexact oracles, which covers various practical scenarios. Under mild conditions and appropriate parameter updates, we establish both $\mathcal{O}(1/k^2)$ non-asymptotic and $o(1/k^2)$ asymptotic convergence rates in expectation for the squared residual norm. Our rate significantly improves the $\mathcal{O}(1/k)$ rates of classical KM-type methods, including their asynchronous variants. We also establish $o(1/k^2)$ almost-sure convergence rates and the almost-sure convergence of the iterates to a solution of the problem. Within our framework, we consider three settings for the underlying operator: (i) a deterministic universal delayed oracle; (ii) a stochastic delayed oracle; and (iii) a finite-sum structure with asynchronous updates. For each case, we instantiate our framework to obtain a concrete algorithmic variant for which our convergence results still apply, and whose iteration complexity depends linearly on the maximum delay. Finally, we verify our algorithms and theoretical results through two numerical examples: a matrix game and a shallow neural network training problem.
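To make the acceleration idea concrete, here is a minimal sketch contrasting the classical KM iteration with a Halpern-anchored variant, one known route to an $O(1/k)$ residual (hence $O(1/k^2)$ squared-residual) rate for nonexpansive operators. The affine operator, step sizes, and anchoring schedule below are illustrative choices, not the paper's actual update, which additionally handles delayed inexact oracles.

```python
import numpy as np

# Illustrative setup (not from the paper): a nonexpansive affine operator
# T(x) = A x + b with A a plane rotation (||A||_2 = 1); its unique fixed
# point is x_star = (I - A)^{-1} b.
theta = 0.5
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
b = np.array([1.0, -2.0])
T = lambda x: A @ x + b
x_star = np.linalg.solve(np.eye(2) - A, b)

def km(x0, iters, lam=0.5):
    """Classical KM iteration: x_{k+1} = (1 - lam) x_k + lam T(x_k)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = (1 - lam) * x + lam * T(x)
    return x

def halpern(x0, iters):
    """Halpern-anchored iteration (illustrative stand-in for acceleration):
    x_{k+1} = b_k x_0 + (1 - b_k) T(x_k) with anchoring weight b_k = 1/(k+2)."""
    x0 = np.array(x0, dtype=float)
    x = x0.copy()
    for k in range(iters):
        beta = 1.0 / (k + 2)
        x = beta * x0 + (1 - beta) * T(x)
    return x

x0 = np.zeros(2)
res_km = np.linalg.norm(km(x0, 200) - T(km(x0, 200)))
res_acc = np.linalg.norm(halpern(x0, 200) - T(halpern(x0, 200)))
print(res_km, res_acc)  # both fixed-point residuals shrink with k
```

The quantity tracked in both cases is the fixed-point residual $\|x_k - T(x_k)\|$, the same measure whose square the paper bounds by $\mathcal{O}(1/k^2)$.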
Problem

Research questions and friction points this paper is trying to address.

How to compute fixed points of nonexpansive operators efficiently when oracle evaluations are delayed and inexact
Whether the O(1/k) rate of classical KM-type methods can be improved under asynchrony
How to treat asynchronous, stochastic, and deterministic delay models within a single framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Accelerated fixed-point methods with delayed inexact oracles
Unified error condition covering various practical scenarios
Improved convergence rates over classical asynchronous variants
Nghia Nguyen-Trung
Department of Statistics and Operations Research, The University of North Carolina at Chapel Hill
Quoc Tran-Dinh
Department of Statistics and Operations Research, The University of North Carolina at Chapel Hill
convex optimization · nonlinear programming · optimization for machine learning