Improved Central Limit Theorem and Bootstrap Approximations for Linear Stochastic Approximation

📅 2025-10-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the multivariate normal approximation of Polyak–Ruppert averaged iterates in linear stochastic approximation (LSA), aiming to improve the Berry–Esseen bound’s accuracy and establish non-asymptotic validity of the multiplier bootstrap. Methodologically, it integrates probabilistic approximation theory, multidimensional Stein’s method, and stochastic process analysis under diminishing step sizes. The main contributions are two-fold: first, it establishes, for the first time, a Berry–Esseen bound of order $O(n^{-1/3})$ in convex distance; second, it rigorously proves that the multiplier bootstrap achieves uniform approximation of the scaled error distribution with rate $O(n^{-1/2})$, yielding the first non-asymptotically valid result for this procedure. Collectively, these advances substantially strengthen finite-sample theoretical guarantees and practical reliability for statistical inference based on LSA estimators.

📝 Abstract
In this paper, we refine the Berry-Esseen bounds for the multivariate normal approximation of Polyak-Ruppert averaged iterates arising from the linear stochastic approximation (LSA) algorithm with decreasing step size. We consider the normal approximation by the Gaussian distribution with the covariance matrix predicted by the Polyak-Juditsky central limit theorem and establish a rate of order up to $n^{-1/3}$ in convex distance, where $n$ is the number of samples used in the algorithm. We also prove non-asymptotic validity of the multiplier bootstrap procedure for approximating the distribution of the rescaled error of the averaged LSA estimator. We establish approximation rates of order up to $1/\sqrt{n}$ for the latter distribution, which significantly improves upon the previous results obtained by Samsonov et al. (2024).
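The setting described in the abstract can be illustrated with a minimal sketch: LSA iterates driven by noisy observations $(A_k, b_k)$ of a linear system, a decreasing step size $\gamma_k = \gamma_0 / k^\alpha$, and a Polyak-Ruppert running average of the iterates. The toy problem, step-size constants, and noise model below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear system: recover theta* solving A theta* = b from noisy
# observations (A_k, b_k). A is made positive definite so the recursion
# is stable (an assumption for this illustration).
d = 3
A = np.eye(d) + 0.1 * rng.standard_normal((d, d))
A = A @ A.T + np.eye(d)
theta_star = rng.standard_normal(d)
b = A @ theta_star

def lsa_polyak_ruppert(n, gamma0=0.5, alpha=0.75, noise=0.1):
    """LSA with decreasing step gamma_k = gamma0 / k^alpha,
    plus Polyak-Ruppert averaging of the iterates."""
    theta = np.zeros(d)
    avg = np.zeros(d)
    for k in range(1, n + 1):
        A_k = A + noise * rng.standard_normal((d, d))  # noisy observation of A
        b_k = b + noise * rng.standard_normal(d)       # noisy observation of b
        gamma_k = gamma0 / k**alpha
        theta = theta - gamma_k * (A_k @ theta - b_k)  # LSA update
        avg += (theta - avg) / k                       # running average of iterates
    return avg

theta_bar = lsa_polyak_ruppert(20000)
print(np.linalg.norm(theta_bar - theta_star))
```

The paper's results concern how closely the distribution of $\sqrt{n}(\bar\theta_n - \theta^\star)$ is approximated by the Gaussian limit from the Polyak-Juditsky CLT, in convex distance.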
Problem

Research questions and friction points this paper is trying to address.

Refining Berry-Esseen bounds for LSA normal approximation
Establishing non-asymptotic bootstrap validity for LSA errors
Improving convergence rates for averaged estimator distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Refined Berry-Esseen bounds for Polyak-Ruppert averaging
Established Gaussian approximation rates up to n^{-1/3}
Proved multiplier bootstrap validity with 1/√n rates
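The multiplier bootstrap whose validity the paper establishes can be sketched in one dimension: each bootstrap replicate reruns the LSA recursion on the same data, with each increment weighted by an i.i.d. positive multiplier of mean 1 and variance 1, and the spread of the bootstrap averages around the original estimate is used to approximate the error distribution. The Exp(1) weights, step-size constants, and toy data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def lsa_path(weights, data, gamma0=0.5, alpha=0.75):
    """Averaged LSA in 1-d; each update is scaled by a multiplier weight w_k."""
    theta, avg = 0.0, 0.0
    for k, ((a_k, b_k), w) in enumerate(zip(data, weights), start=1):
        gamma_k = gamma0 / k**alpha
        theta -= gamma_k * w * (a_k * theta - b_k)
        avg += (theta - avg) / k
    return avg

# Toy data: noisy observations of a*theta* = b with a = 1, theta* = 2.
n = 5000
theta_star = 2.0
data = [(1.0 + 0.1 * rng.standard_normal(),
         theta_star + 0.1 * rng.standard_normal()) for _ in range(n)]

theta_bar = lsa_path(np.ones(n), data)  # original averaged estimate (all weights 1)

# Bootstrap replicates: rerun on the SAME data with Exp(1) multipliers
# (positive, mean 1, variance 1).
boots = np.array([lsa_path(rng.exponential(1.0, size=n), data)
                  for _ in range(200)])
half_width = np.quantile(np.abs(boots - theta_bar), 0.95)
print(theta_bar, half_width)
```

The paper shows that the conditional distribution of the rescaled bootstrap error approximates the distribution of the rescaled estimation error uniformly, at a rate of order up to $1/\sqrt{n}$.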
Bogdan Butyrin
HSE University
Probability, Statistics
Eric Moulines
Professor, École Polytechnique, Member of the Académie des Sciences
Statistics, Machine Learning, Signal Processing
Alexey Naumov
Professor, HSE University
Probability Theory, Statistics, Machine Learning, Random Matrices, Reinforcement Learning
Sergey Samsonov
HSE University, Moscow
High-Dimensional Probability, Markov Chains, MCMC
Qi-Man Shao
Shenzhen International Center of Mathematics, Southern University of Science and Technology, Xueyuan Blvd., 518000, Shenzhen, P.R. China
Zhuo-Song Zhang
Shenzhen International Center of Mathematics, Southern University of Science and Technology, Xueyuan Blvd., 518000, Shenzhen, P.R. China