Convergence Rate in Nonlinear Two-Time-Scale Stochastic Approximation with State (Time)-Dependence

📅 2025-09-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the convergence rate of nonlinear two-timescale stochastic approximation under state- and time-dependent (nonstationary) noise. Addressing the limitation of existing theory in handling such nonstationary noise, we propose a Lyapunov-based analytical framework that integrates the two-timescale update mechanism with Polyak–Ruppert averaging. We establish, for the first time, a polynomial upper bound on the convergence rate. Moreover, we uncover a novel phenomenon: as the noise parameters approach stationarity, the algorithm achieves exponential convergence—a property previously unobserved. Our theoretical results uniformly apply to canonical settings including stochastic gradient descent and stochastic bilevel optimization. Numerical experiments corroborate both the tightness of the derived convergence order and the practical efficacy of the proposed framework.

📝 Abstract
The nonlinear two-time-scale stochastic approximation is widely studied under the assumption of bounded noise variances. Motivated by recent advances that allow variability linked to the current state or time, we consider state- and time-dependent noises. We show that the Lyapunov function exhibits polynomial convergence rates in both cases, with the polynomial decay rate depending on the parameters of the state- or time-dependent noise. Notably, if the state-noise parameters fully approach their limiting values, the Lyapunov function achieves an exponential convergence rate. We provide two numerical examples illustrating our theoretical findings in the context of stochastic gradient descent with Polyak–Ruppert averaging and stochastic bilevel optimization.
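As a rough illustration (not taken from the paper), a two-time-scale update with Polyak–Ruppert averaging can be sketched on a toy quadratic problem. The problem, the step-size exponents, and the noise scales below are all assumptions chosen for the sketch:

```python
import random

random.seed(0)

# Toy two-time-scale stochastic approximation with Polyak-Ruppert averaging.
# Illustrative sketch only: the objective f(x) = x^2 / 2, the auxiliary target
# h(x) = x, and the step-size schedules are assumptions, not from the paper.
x, y = 5.0, 0.0     # slow iterate x, fast iterate y
x_avg = 0.0         # Polyak-Ruppert running average of the slow iterate

N = 20_000
for n in range(1, N + 1):
    a_n = n ** -0.6          # fast step size
    b_n = 1.0 / n            # slow step size; b_n / a_n -> 0 (time-scale separation)
    y += a_n * (x - y + random.gauss(0.0, 0.1))  # fast update: y tracks h(x) = x
    x -= b_n * (y + random.gauss(0.0, 0.1))      # slow update: noisy gradient step via y
    x_avg += (x - x_avg) / n                     # running Polyak-Ruppert average

print(f"averaged slow iterate: {x_avg:.3f}")     # should be close to the optimum x* = 0
```

The averaging step is the standard running-mean form of Polyak–Ruppert averaging; the separation condition b_n / a_n → 0 is what lets the fast iterate "equilibrate" before the slow iterate moves.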
Problem

Research questions and friction points this paper is trying to address.

Analyzing convergence rates in nonlinear two-time-scale stochastic approximation
Studying state- and time-dependent noise effects on Lyapunov function convergence
Establishing polynomial and exponential convergence rates under varying conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

State- and time-dependent noise analysis
Lyapunov function polynomial convergence rates
Exponential convergence under limiting parameters
Zixi Chen
School of Mathematical Sciences, Peking University
Yumin Xu
School of Mathematical Sciences, Peking University
Ruixun Zhang
Peking University
Sustainable Investing · Machine Learning · Market Microstructure · Adaptive Markets