Convergence Analysis of the PAGE Stochastic Algorithm for Convex Finite-Sum Optimization

📅 2025-08-31
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper investigates the convergence of the PAGE stochastic algorithm for smooth convex finite-sum optimization. It extends PAGE—originally analyzed in the nonconvex setting—to the convex regime, where the objective is the average of smooth convex functions. The analysis combines stochastic gradient estimation with probabilistic variance reduction, using smoothness (Lipschitz-continuous gradients) and convexity to derive tighter bounds. Compared with PAGE's known complexity for nonconvex problems, the paper establishes a sharper convergence rate and reduces the total gradient computation complexity. The main contributions are: (1) a convergence theory for PAGE in convex finite-sum optimization; (2) a characterization of how variance reduction improves convergence under convexity; and (3) new theoretical benchmarks and applicability boundaries for PAGE-type algorithms.

📝 Abstract
PAGE is a stochastic algorithm proposed by Li et al. [2021] to find a stationary point of an average of smooth nonconvex functions. We analyze PAGE in the convex setting and derive new convergence rates, leading to a better complexity than in the general nonconvex regime.
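The PAGE estimator of Li et al. [2021] keeps a running gradient estimate and, at each step, either recomputes the full gradient (with a small probability p) or applies a cheap SARAH-style recursive mini-batch correction. The sketch below illustrates this update rule on a toy convex least-squares finite sum; the problem sizes, step size, and the choice p = b/(b+n) are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

# Toy convex finite sum: f(x) = (1/n) * sum_i 0.5 * (a_i . x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 64, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad(x, idx):
    # Gradient of the average of f_i over the index set idx.
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

x = np.zeros(d)
g = grad(x, np.arange(n))      # initialize with one full gradient
lr, batch = 0.05, 8            # illustrative step size and mini-batch size
p = batch / (batch + n)        # illustrative switching probability

for _ in range(500):
    x_new = x - lr * g
    if rng.random() < p:
        # With probability p: recompute the full-batch gradient.
        g = grad(x_new, np.arange(n))
    else:
        # With probability 1 - p: recursive variance-reduced update.
        idx = rng.choice(n, size=batch, replace=False)
        g = g + grad(x_new, idx) - grad(x, idx)
    x = x_new

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x - x_star))
```

On this strongly convex instance the iterates approach the least-squares solution while most steps touch only a mini-batch of size b, which is the source of PAGE's low expected per-iteration gradient cost.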
Problem

Research questions and friction points this paper is trying to address.

Analyzing the convergence of the PAGE algorithm in convex optimization
Deriving improved convergence rates for convex finite-sum problems
Improving complexity bounds relative to the nonconvex setting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzes the PAGE algorithm in the convex setting
Derives new convergence rates
Improves complexity over the nonconvex regime