Suspicious Alignment of SGD: A Fine-Grained Step Size Condition Analysis

📅 2026-01-16
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work investigates the "suspicious alignment" phenomenon in stochastic gradient descent (SGD) applied to ill-conditioned high-dimensional quadratic optimization, where gradients become highly aligned with the dominant eigenspace of the Hessian yet fail to effectively reduce the loss. Through spectral decomposition of the Hessian and a dynamic analysis of gradient alignment, the study reveals a two-phase evolution under constant step size and large initialization: an initial alignment-decreasing phase followed by stabilization at high alignment. The authors introduce an adaptive critical step size condition that characterizes how the step size differentially influences alignment dynamics in low- versus high-alignment regimes. They further identify a specific step size regime in which projections onto the bulk subspace remain effective at reducing the loss while projections onto the dominant Hessian subspace break down, thereby elucidating the origin of this paradoxical behavior.
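As a sanity check on this picture, here is a minimal NumPy sketch (not the authors' code) of SGD on an ill-conditioned quadratic f(x) = ½ xᵀHx, tracking the fraction of stochastic-gradient energy that falls in the dominant eigenspace. The spectrum (five outlier eigenvalues of 100 over a bulk in [0.1, 1]), the additive Gaussian gradient noise, and the step size are all illustrative assumptions, not values from the paper:

```python
# Minimal sketch (not the authors' code): SGD on an ill-conditioned quadratic
# f(x) = 0.5 * x^T H x, tracking how much stochastic-gradient energy lies in
# the dominant eigenspace of H. All numerical values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d, k = 500, 5                       # ambient dim, dominant-subspace dim (assumed)
eigs = np.concatenate([np.full(k, 100.0),               # dominant (outlier) eigenvalues
                       np.linspace(0.1, 1.0, d - k)])   # bulk eigenvalues
H = np.diag(eigs)                   # w.l.o.g. work in the eigenbasis of the Hessian

def alignment(g):
    """Fraction of gradient energy in the dominant subspace (first k coordinates)."""
    return np.sum(g[:k] ** 2) / np.sum(g ** 2)

x = 10.0 * rng.standard_normal(d)   # large initialization
eta, sigma = 0.0199, 0.5            # constant step size, noise level (assumed)
for t in range(501):
    g = H @ x + sigma * rng.standard_normal(d)   # gradient + additive noise (assumption)
    if t % 50 == 0:
        print(f"t={t:3d}  loss={0.5 * x @ H @ x:12.2f}  alignment={alignment(g):.3f}")
    x -= eta * g
```

With these assumed values, η·λ_max ≈ 1.99 < 2, so every mode is stable but the dominant modes sit near the edge of stability; the printed alignment typically starts high, dips as the dominant coordinates contract faster than the bulk, then climbs back toward 1 as the noise floor in the dominant directions comes to dominate the gradient, qualitatively matching the two-phase picture above.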

πŸ“ Abstract
This paper explores the suspicious alignment phenomenon in stochastic gradient descent (SGD) under ill-conditioned optimization, where the Hessian spectrum splits into dominant and bulk subspaces. This phenomenon describes the behavior of gradient alignment in SGD updates. Specifically, during the initial phase of SGD updates, the alignment between the gradient and the dominant subspace tends to decrease. Subsequently, it enters a rising phase and eventually stabilizes in a high-alignment phase. The alignment is considered "suspicious" because, paradoxically, the projected gradient update along this highly aligned dominant subspace proves ineffective at reducing the loss. The focus of this work is to give a fine-grained analysis, in a high-dimensional quadratic setup, of how step size selection produces this phenomenon. Our main contributions can be summarized as follows: We propose a step-size condition revealing that in low-alignment regimes, an adaptive critical step size $\eta_t^*$ separates alignment-decreasing ($\eta_t<\eta_t^*$) from alignment-increasing ($\eta_t>\eta_t^*$) regimes, whereas in high-alignment regimes, the alignment is self-correcting and decreases regardless of the step size. We further show that under sufficient ill-conditioning, a step size interval exists where projecting the SGD updates to the bulk space decreases the loss while projecting them to the dominant space increases the loss, which explains a recent empirical observation that projecting gradient updates to the dominant subspace is ineffective. Finally, based on this adaptive step-size theory, we prove that for a constant step size and large initialization, SGD exhibits this distinct two-phase behavior: an initial alignment-decreasing phase, followed by stabilization at high alignment.
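The projection claim lends itself to a direct numerical check. The hedged sketch below (an illustration, not the paper's experiment) takes an iterate whose dominant coordinates have already contracted to a noise floor while the bulk coordinates remain far from the optimum, then compares the average loss change from an SGD step projected onto the dominant versus the bulk subspace; all constants are assumptions carried over from the sketch above:

```python
# Illustration (not the paper's experiment): compare the mean loss change of
# an SGD step projected onto the dominant vs. the bulk eigenspace of H, at an
# iterate where the dominant coordinates sit near their noise floor.
import numpy as np

rng = np.random.default_rng(1)
d, k, sigma, eta = 500, 5, 0.5, 0.0199        # all values are assumptions
eigs = np.concatenate([np.full(k, 100.0), np.linspace(0.1, 1.0, d - k)])
H = np.diag(eigs)
loss = lambda v: 0.5 * v @ H @ v

# Assumed state late in training: dominant coordinates nearly converged,
# bulk coordinates still far from the optimum.
x = np.concatenate([0.05 * rng.standard_normal(k),
                    2.0 * rng.standard_normal(d - k)])

delta_dom = delta_bulk = 0.0
n_trials = 2000
for _ in range(n_trials):
    g = H @ x + sigma * rng.standard_normal(d)  # stochastic gradient
    g_dom, g_bulk = g.copy(), g.copy()
    g_dom[k:] = 0.0     # keep only the dominant-subspace component
    g_bulk[:k] = 0.0    # keep only the bulk-subspace component
    delta_dom += loss(x - eta * g_dom) - loss(x)
    delta_bulk += loss(x - eta * g_bulk) - loss(x)

print(f"mean loss change, dominant projection: {delta_dom / n_trials:+.4f}")  # typically > 0 here
print(f"mean loss change, bulk projection:     {delta_bulk / n_trials:+.4f}")  # typically < 0 here
```

Under these assumptions, the expected loss change of a step restricted to coordinate set $S$ is $\sum_{i\in S} \big[-\eta \lambda_i^2 x_i^2 (1 - \tfrac{1}{2}\eta\lambda_i) + \tfrac{1}{2}\eta^2\sigma^2\lambda_i\big]$: with $\eta\lambda_{\mathrm{dom}}$ close to 2 and $x$ near the noise floor in the dominant directions, the curvature-and-noise term outweighs the small descent term there, so the dominant-projected step increases the loss while the bulk-projected step decreases it, consistent with the interval described in the abstract.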
Problem

Research questions and friction points this paper is trying to address.

suspicious alignment
stochastic gradient descent
ill-conditioned optimization
Hessian spectrum
gradient alignment
Innovation

Methods, ideas, or system contributions that make the work stand out.

suspicious alignment
stochastic gradient descent
step size condition
ill-conditioned optimization
Hessian spectrum