Variance-Reduced Fast Operator Splitting Methods for Stochastic Generalized Equations

📅 2025-04-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses slow convergence, high variance, and the difficulty of handling nonmonotone operators in solving stochastic generalized equations (SGEs). To this end, we propose two accelerated variance-reduced operator splitting algorithms, a forward-backward splitting (FBS) method and a backward-forward splitting (BFS) method, covering both finite-sum and stochastic settings in a unified way. Our contributions are threefold: (i) we establish, for the first time, almost-sure convergence and almost-sure convergence rates for stochastic accelerated fixed-point-type algorithms, along with explicit iteration-wise rates; (ii) we design a generic class of variance-reduced estimators, covering SVRG, SAGA, SARAH, and Hybrid-SGD, that achieves the best-known oracle complexity without enhancement techniques; (iii) we analyze the methods under co-hypomonotonicity, which relaxes classical monotonicity requirements. Theoretically, we prove that the expected squared residual norm converges at the rate $\mathcal{O}(1/k^2)$, and almost surely at the stronger rate $o(1/k^2)$. Numerical experiments on several nonmonotone problems validate the theory and compare performance against existing methods.
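For context, a generalized equation of the type studied here is standardly written as follows (a standard template, not quoted from the paper): find $x^{\star}$ such that $0 \in Fx^{\star} + Tx^{\star}$, where $F$ is a single-valued operator given either as a finite sum $Fx = \frac{1}{n}\sum_{i=1}^{n} F_i x$ or as an expectation $Fx = \mathbb{E}_{\xi}[\mathbf{F}(x, \xi)]$, and $T$ is a set-valued operator whose resolvent $J_{\lambda T} = (\mathbb{I} + \lambda T)^{-1}$ is assumed cheap to evaluate.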

📝 Abstract
We develop two classes of variance-reduced fast operator splitting methods to approximate solutions of both finite-sum and stochastic generalized equations. Our approach integrates recent advances in accelerated fixed-point methods, co-hypomonotonicity, and variance reduction. First, we introduce a class of variance-reduced estimators and establish their variance-reduction bounds. This class covers both unbiased and biased instances and comprises common estimators as special cases, including SVRG, SAGA, SARAH, and Hybrid-SGD. Next, we design a novel accelerated variance-reduced forward-backward splitting (FBS) algorithm using these estimators to solve finite-sum and stochastic generalized equations. Our method achieves both $\mathcal{O}(1/k^2)$ and $o(1/k^2)$ convergence rates on the expected squared norm $\mathbb{E}[\| G_{\lambda}x^k \|^2]$ of the FBS residual $G_{\lambda}$, where $k$ is the iteration counter. Additionally, we establish, for the first time, almost sure convergence rates and almost sure convergence of iterates to a solution in stochastic accelerated methods. Unlike existing stochastic fixed-point algorithms, our methods accommodate co-hypomonotone operators, which potentially include nonmonotone problems arising from recent applications. We further specify our method to derive an appropriate variant for each stochastic estimator -- SVRG, SAGA, SARAH, and Hybrid-SGD -- demonstrating that they achieve the best-known complexity for each without relying on enhancement techniques. Alternatively, we propose an accelerated variance-reduced backward-forward splitting (BFS) method, which attains similar convergence rates and oracle complexity as our FBS method. Finally, we validate our results through several numerical experiments and compare their performance.
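For reference, the FBS residual mentioned in the abstract is standardly defined as $G_{\lambda}x := \frac{1}{\lambda}\big(x - J_{\lambda T}(x - \lambda Fx)\big)$ with $J_{\lambda T} := (\mathbb{I} + \lambda T)^{-1}$, so that $G_{\lambda}x^{\star} = 0$ exactly when $x^{\star}$ solves $0 \in Fx^{\star} + Tx^{\star}$ (standard definition; the paper's exact normalization may differ).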
Problem

Research questions and friction points this paper is trying to address.

Stochastic methods for generalized equations suffer from slow convergence and high estimator variance
Accelerated rates and almost-sure convergence guarantees for FBS residuals were missing in the stochastic setting
Existing stochastic fixed-point algorithms require monotonicity, excluding co-hypomonotone and nonmonotone operators
Innovation

Methods, ideas, or system contributions that make the work stand out.

A unified class of variance-reduced estimators (SVRG, SAGA, SARAH, Hybrid-SGD) with variance-reduction bounds
Accelerated variance-reduced forward-backward (FBS) and backward-forward (BFS) splitting algorithms; see the sketch below
Convergence analysis under co-hypomonotonicity, covering a class of nonmonotone operators
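As an illustration of how such a method can look in practice, here is a minimal Python sketch of a variance-reduced FBS loop with an SVRG-style estimator. The names (`F_ops`, `prox_T`), the step size `lam`, and the plain Nesterov-style extrapolation are illustrative assumptions, not the paper's exact accelerated scheme.

```python
import numpy as np

def svrg_fbs(F_ops, prox_T, x0, lam=0.1, beta=0.9, epochs=20, rng=None):
    """Minimal sketch: SVRG-style variance-reduced forward-backward splitting.

    F_ops  : list of single-sample operators F_i(x) -> ndarray (finite-sum case)
    prox_T : resolvent J_{lam*T}(x) = (I + lam*T)^{-1}(x)
    Illustrative only; not the paper's exact accelerated algorithm.
    """
    rng = rng or np.random.default_rng(0)
    n = len(F_ops)
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(epochs):
        # Snapshot: full operator value at the anchor point (SVRG reference).
        x_snap = x.copy()
        F_snap = sum(Fi(x_snap) for Fi in F_ops) / n
        for _ in range(n):
            # Extrapolation step (illustrative momentum choice).
            y = x + beta * (x - x_prev)
            i = rng.integers(n)
            # SVRG estimator: unbiased, with variance shrinking as y -> x_snap.
            g = F_ops[i](y) - F_ops[i](x_snap) + F_snap
            # Forward step on the estimator, then backward (resolvent) step.
            x_prev, x = x, prox_T(y - lam * g)
    return x
```

For instance, with $T$ the normal cone of a convex set $C$, `prox_T` reduces to the Euclidean projection onto $C$.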