Statistical Inference for Linear Functionals of Online Least-squares SGD when $t \gtrsim d^{1+\delta}$

📅 2025-10-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses statistical inference for linear functionals in high-dimensional online least-squares stochastic gradient descent (SGD): specifically, how to construct data-driven, scalable confidence intervals without batch processing or explicit covariance matrix inversion when the number of iterations satisfies $t \gtrsim d^{1+\delta}$. We propose the first fully online inference framework, establishing a non-asymptotic Gaussian central limit theorem via a Berry–Esseen bound under the near-optimal dimensional growth condition $t \gtrsim d^{1+\delta}$. We design a novel online variance estimator achieving $\mathcal{O}(d)$ memory and $\mathcal{O}(td)$ time complexity. Our method provides high-probability deviation control and, for the first time, enables fully adaptive, hyperparameter-free confidence interval construction in high dimensions ($d$ large), without requiring problem-specific tuning or offline post-processing.

📝 Abstract
Stochastic Gradient Descent (SGD) has become a cornerstone method in modern data science. However, deploying SGD in high-stakes applications necessitates rigorous quantification of its inherent uncertainty. In this work, we establish \emph{non-asymptotic Berry--Esseen bounds} for linear functionals of online least-squares SGD, thereby providing a Gaussian Central Limit Theorem (CLT) in a \emph{growing-dimensional regime}. Existing approaches to high-dimensional inference for projection parameters, such as~\cite{chang2023inference}, rely on inverting empirical covariance matrices and require at least $t \gtrsim d^{3/2}$ iterations to achieve finite-sample Berry--Esseen guarantees, rendering them computationally expensive and restrictive in the allowable dimensional scaling. In contrast, we show that a CLT holds for SGD iterates when the number of iterations grows as $t \gtrsim d^{1+\delta}$ for any $\delta > 0$, significantly extending the dimensional regime permitted by prior works while improving computational efficiency. The proposed online SGD-based procedure operates in $\mathcal{O}(td)$ time and requires only $\mathcal{O}(d)$ memory, in contrast to the $\mathcal{O}(td^2 + d^3)$ runtime of covariance-inversion methods. To render the theory practically applicable, we further develop an \emph{online variance estimator} for the asymptotic variance appearing in the CLT and establish \emph{high-probability deviation bounds} for this estimator. Collectively, these results yield the first fully online and data-driven framework for constructing confidence intervals for SGD iterates in the near-optimal scaling regime $t \gtrsim d^{1+\delta}$.
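The Berry–Esseen result described above can be written schematically as follows; this is a generic template, not the paper's exact statement, and the rate function $r(t,d)$, the functional direction $v$, the averaged iterate $\bar{\theta}_t$, and the asymptotic standard deviation $\sigma_v$ are placeholder symbols:

```latex
\[
\sup_{z \in \mathbb{R}}
\left|
\mathbb{P}\!\left(
  \frac{\sqrt{t}\,\bigl\langle v,\, \bar{\theta}_t - \theta^{\star} \bigr\rangle}{\sigma_v}
  \le z
\right)
- \Phi(z)
\right|
\;\le\; C\, r(t,d),
\qquad
r(t,d) \to 0 \text{ whenever } t \gtrsim d^{1+\delta},\ \delta > 0.
\]
```

Here $\Phi$ denotes the standard Gaussian CDF; the paper's contribution is establishing such a bound with $r(t,d)$ vanishing under the near-optimal scaling $t \gtrsim d^{1+\delta}$, rather than the $t \gtrsim d^{3/2}$ required by covariance-inversion approaches.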
Problem

Research questions and friction points this paper is trying to address.

Establishing non-asymptotic Berry-Esseen bounds for SGD linear functionals
Providing Gaussian CLT guarantees in growing-dimensional regime t ≳ d¹⁺δ
Developing online framework for confidence intervals with improved computational efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Non-asymptotic Berry-Esseen bounds for SGD linear functionals
Online SGD procedure with O(td) time and O(d) memory
Online variance estimator with high-probability deviation bounds
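To make the complexity claims concrete, here is a minimal sketch of an online least-squares SGD loop that tracks a linear functional and a running variance estimate in $\mathcal{O}(d)$ memory and $\mathcal{O}(td)$ time. This is an illustrative generic construction (Polyak–Ruppert averaging plus a Welford-style running variance of the functional values), not the paper's actual estimator; the step-size schedule and all parameter names are assumptions.

```python
import numpy as np

def online_sgd_ci(stream, v, d, eta0=0.1, alpha=0.51, z=1.96):
    """Illustrative sketch: online least-squares SGD tracking the linear
    functional <v, theta> with a running plug-in variance estimate.
    NOT the paper's estimator -- a generic O(d)-memory construction.
    `stream` yields (x, y) pairs; assumes at least two observations."""
    theta = np.zeros(d)        # current SGD iterate
    avg = np.zeros(d)          # Polyak-Ruppert average of iterates
    mean_f = 0.0               # running mean of functional values
    m2 = 0.0                   # Welford running sum of squared deviations
    for t, (x, y) in enumerate(stream, start=1):
        eta = eta0 * t ** (-alpha)         # polynomially decaying step size
        grad = (x @ theta - y) * x         # least-squares stochastic gradient
        theta -= eta * grad
        avg += (theta - avg) / t           # online iterate average
        f = v @ theta                      # functional of current iterate
        delta = f - mean_f                 # Welford online variance update
        mean_f += delta / t
        m2 += delta * (f - mean_f)
    var = m2 / (t - 1)                     # sample variance of functional values
    half = z * np.sqrt(var / t)            # normal-approximation half-width
    est = v @ avg
    return est, (est - half, est + half)
```

Each iteration touches only $\mathcal{O}(d)$ numbers, so the total cost is $\mathcal{O}(td)$ time and $\mathcal{O}(d)$ memory, matching the complexity regime the paper targets (in contrast to the $\mathcal{O}(td^2 + d^3)$ cost of inverting an empirical covariance matrix).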
Bhavya Agrawalla
Computer Science Department, Carnegie Mellon University
Krishnakumar Balasubramanian
University of California, Davis
Statistics · Optimization · Machine learning
Promit Ghosal
Department of Statistics, University of Chicago