🤖 AI Summary
Gradient descent lacks natural scaling, which makes it highly sensitive to learning rate tuning and limits its robustness and efficiency in large-scale nonconvex optimization. To address this, we propose a Hessian-aware adaptive gradient scaling method that rescales the gradient using the local curvature of the function along the gradient direction, guaranteeing a local unit step size even in nonconvex settings and eliminating the need for line searches or heuristic tuning. Theoretically, we establish global convergence under a significantly weakened Lipschitz gradient assumption, as well as linear convergence with a unit step size near local minima satisfying the second-order sufficient conditions; these guarantees remain valid under mild conditions even when Hessian information is inexact. Empirically, the method consistently accelerates convergence and improves stability across diverse convex and nonconvex machine learning tasks.
📝 Abstract
Gradient descent is the primary workhorse for optimizing large-scale problems in machine learning. However, its performance is highly sensitive to the choice of the learning rate. A key limitation of gradient descent is its lack of natural scaling, which often necessitates expensive line searches or heuristic tuning to determine an appropriate step size. In this paper, we address this limitation by incorporating Hessian information to scale the gradient direction. By accounting for the curvature of the function along the gradient, our adaptive, Hessian-aware scaling method provides a local unit step size guarantee, even in nonconvex settings. Near a local minimum that satisfies the second-order sufficient conditions, our approach achieves linear convergence with a unit step size. We show that our method converges globally under a significantly weaker version of the standard Lipschitz gradient smoothness assumption. Even when Hessian information is inexact, the local unit step size guarantee and global convergence properties remain valid under mild conditions. Finally, we validate our theoretical results empirically on a range of convex and nonconvex machine learning tasks, showcasing the effectiveness of the approach.
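To make the idea concrete, here is a minimal sketch of curvature-aware gradient scaling. This is an illustrative interpretation, not the paper's exact algorithm: the step size is taken as ||g||² / |gᵀHg|, the magnitude of the Newton step length restricted to the gradient direction, computed with a Hessian-vector product and safeguarded against near-zero curvature. The function names (`curvature_scaled_step`, `hvp_f`) are hypothetical.

```python
import numpy as np

def curvature_scaled_step(grad_f, hvp_f, x, eps=1e-8):
    """One gradient step scaled by the curvature along the gradient.

    Illustrative sketch only: alpha = ||g||^2 / |g^T H g|, i.e. the
    (absolute) Newton step length along the gradient direction. Taking
    the absolute value of the curvature keeps the step well defined in
    nonconvex regions; eps guards against near-zero curvature.
    """
    g = grad_f(x)
    Hg = hvp_f(x, g)                     # Hessian-vector product H g
    curv = abs(g @ Hg)                   # |g^T H g|
    alpha = (g @ g) / max(curv, eps)     # curvature-scaled step size
    return x - alpha * g

# Toy example: quadratic f(x) = 0.5 x^T A x, so grad f = A x and H = A.
A = np.diag([1.0, 10.0])
grad_f = lambda x: A @ x
hvp_f = lambda x, v: A @ v

x = np.array([1.0, 1.0])
for _ in range(50):
    x = curvature_scaled_step(grad_f, hvp_f, x)
# x is driven toward the minimizer at the origin
```

On this quadratic the scaled step coincides with exact line search along the gradient, so no learning rate needs to be tuned; near a well-conditioned minimum the step size approaches a natural unit scale, which is the behavior the paper's guarantee formalizes.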