🤖 AI Summary
This paper addresses the multivariate normal approximation of Polyak–Ruppert averaged iterates in linear stochastic approximation (LSA), aiming to sharpen Berry–Esseen bounds and establish non-asymptotic validity of the multiplier bootstrap. Methodologically, it integrates probabilistic approximation theory, multidimensional Stein's method, and stochastic process analysis under diminishing step sizes. The main contributions are two-fold: first, it establishes a Berry–Esseen bound of order up to $O(n^{-1/3})$ in convex distance, refining previously known rates; second, it proves non-asymptotic validity of the multiplier bootstrap, showing that it approximates the distribution of the rescaled error at rate up to $O(n^{-1/2})$, improving on the rates of Samsonov et al. (2024). Collectively, these advances strengthen finite-sample theoretical guarantees and the practical reliability of statistical inference based on LSA estimators.
📝 Abstract
In this paper, we refine the Berry-Esseen bounds for the multivariate normal approximation of Polyak-Ruppert averaged iterates arising from the linear stochastic approximation (LSA) algorithm with decreasing step size. We consider the normal approximation by the Gaussian distribution with covariance matrix predicted by the Polyak-Juditsky central limit theorem and establish a rate of order up to $n^{-1/3}$ in convex distance, where $n$ is the number of samples used in the algorithm. We also prove the non-asymptotic validity of the multiplier bootstrap procedure for approximating the distribution of the rescaled error of the averaged LSA estimator. We establish approximation rates of order up to $1/\sqrt{n}$ for the latter distribution, which significantly improves upon the previous results obtained by Samsonov et al. (2024).
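To make the setup concrete, the following is a minimal sketch of LSA with decreasing step size, Polyak-Ruppert averaging, and one multiplier-bootstrap replicate. The problem instance, noise level, step-size schedule $\gamma_k = c_0/k^{\alpha}$, and exponential multiplier weights are all illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: solve A theta* = b from noisy samples (A_k, b_k).
d, n = 3, 20000
A = np.diag([1.0, 1.5, 2.0])
b = np.array([1.0, -1.0, 0.5])
theta_star = np.linalg.solve(A, b)

# Pre-generate the observation noise so bootstrap replicates reuse the
# same data and differ only through the multiplier weights.
A_noise = 0.1 * rng.standard_normal((n, d, d))
b_noise = 0.1 * rng.standard_normal((n, d))

def lsa_polyak_ruppert(weights=None):
    """LSA recursion theta_{k+1} = theta_k - gamma_k w_k (A_k theta_k - b_k)
    with gamma_k = c0 / k**alpha; returns the Polyak-Ruppert average.
    With weights=None all w_k = 1 (the plain recursion)."""
    c0, alpha = 0.5, 0.75  # illustrative step-size constants
    theta = np.zeros(d)
    avg = np.zeros(d)
    for k in range(1, n + 1):
        A_k = A + A_noise[k - 1]
        b_k = b + b_noise[k - 1]
        gamma = c0 / k**alpha
        w = 1.0 if weights is None else weights[k - 1]
        theta = theta - gamma * w * (A_k @ theta - b_k)
        avg += (theta - avg) / k  # running average of the iterates
    return avg

theta_bar = lsa_polyak_ruppert()

# One multiplier-bootstrap replicate: i.i.d. mean-1, variance-1 weights
# (exponential here as an example). The distribution of
# sqrt(n) * (theta_bar_boot - theta_bar) over many replicates is used to
# approximate that of sqrt(n) * (theta_bar - theta_star).
w = rng.exponential(1.0, size=n)
theta_bar_boot = lsa_polyak_ruppert(weights=w)
root_stat = np.sqrt(n) * (theta_bar_boot - theta_bar)
```

In practice many bootstrap replicates are drawn, and the empirical quantiles of the rescaled bootstrap statistic yield confidence sets for $\theta^*$; the paper's contribution is quantifying how fast this approximation becomes valid.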