🤖 AI Summary
Standard optimizers (e.g., SGD, AdamW, Lion) rely on $\ell_2$- or $\ell_\infty$-norm-based steepest descent, which is poorly matched to the non-Euclidean geometric structure prevalent in deep neural network training.
Method: We propose Stacey, a new accelerated stochastic steepest descent algorithm for non-Euclidean $\ell_p$-smooth nonconvex optimization with adaptive $\ell_p$-norm updates ($p \in (1, 2)$), built on an interpolation-type primal-dual iteration scheme (a sketch of the underlying $\ell_p$ steepest-descent step follows this summary).
Contribution/Results: Stacey is the first method to establish provably accelerated convergence guarantees for $\ell_p$-smooth nonconvex optimization, closing a long-standing theoretical gap. Empirically, it converges faster and reaches higher final accuracy than state-of-the-art optimizers on image classification and large language model pretraining. Its core innovation is breaking the $\ell_2$/$\ell_\infty$ paradigm: by allowing $p$ to be tuned per task and model, Stacey exploits optimization dynamics better matched to the intrinsic non-Euclidean geometry.
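For intuition, the building block behind $\ell_p$ steepest descent is the unit-$\ell_p$-norm direction that most decreases the linearized loss. Below is a minimal NumPy sketch of this generic step (the function name and toy usage are ours for illustration; this is not the full Stacey update, which couples such steps through a primal-dual scheme):

```python
import numpy as np

def lp_steepest_descent_step(x, grad, lr=0.1, p=1.5, eps=1e-12):
    """One generic l_p steepest-descent step (illustrative, not Stacey itself).

    With q the dual exponent (1/p + 1/q = 1), the unit-l_p-norm direction d
    minimizing <grad, d> is
        d_i = -sign(g_i) * |g_i|^(q-1) / ||g||_q^(q-1).
    p = 2 recovers normalized gradient descent; p -> infinity recovers sign
    descent, the l_infty geometry behind Lion-style updates.
    """
    q = p / (p - 1.0)                                    # dual exponent of p
    dual_norm = np.sum(np.abs(grad) ** q) ** (1.0 / q)   # ||g||_q
    d = -np.sign(grad) * np.abs(grad) ** (q - 1) / (dual_norm ** (q - 1) + eps)
    return x + lr * d

# Toy usage on f(x) = 0.5 * ||x||_2^2, whose gradient is x itself.
x = np.array([3.0, -4.0])
for _ in range(50):
    x = lp_steepest_descent_step(x, grad=x, lr=0.1, p=1.5)
print(x)  # hovers near the origin: each step has fixed l_p length lr
```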
📝 Abstract
While popular optimization methods such as SGD, AdamW, and Lion depend on steepest descent updates in either $\ell_2$ or $\ell_\infty$ norms, there remains a critical gap in handling the non-Euclidean structure observed in modern deep network training. In this work, we address this need by introducing a new accelerated $\ell_p$ steepest descent algorithm, called Stacey, which uses interpolated primal-dual iterate sequences to effectively navigate non-Euclidean smooth optimization tasks. In addition to providing novel theoretical guarantees for the foundations of our algorithm, we empirically compare our approach against these popular methods on tasks including image classification and large language model (LLM) pretraining, demonstrating both faster convergence and higher final accuracy. We further evaluate different values of $p$ across various models and datasets, underscoring the importance and efficiency of non-Euclidean approaches over standard Euclidean methods. Code can be found at https://github.com/xinyuluo8561/Stacey.
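The exact interpolated primal-dual iteration is specified in the paper. Purely as a loose illustration of what an interpolation-type scheme looks like, the sketch below couples a primal sequence and a dual/momentum sequence through an interpolated query point; the coupling coefficients and step sizes here are our assumptions, not the paper's:

```python
import numpy as np

def accelerated_lp_descent(grad_fn, x0, lr=0.1, p=1.5, tau=0.5, steps=100):
    """Generic interpolation-type primal-dual loop (illustration only;
    the actual Stacey iteration uses different coupling coefficients).

    A primal sequence x and a dual/momentum sequence z are queried at an
    interpolated point y = (1 - tau) * x + tau * z; the gradient at y
    drives an l_p steepest-descent update of both sequences.
    """
    q = p / (p - 1.0)
    x = z = np.asarray(x0, dtype=float)
    for _ in range(steps):
        y = (1.0 - tau) * x + tau * z                    # interpolated iterate
        g = grad_fn(y)
        dual_norm = np.sum(np.abs(g) ** q) ** (1.0 / q) + 1e-12
        d = -np.sign(g) * np.abs(g) ** (q - 1) / dual_norm ** (q - 1)
        x = y + lr * d                                   # short primal step
        z = z + (lr / tau) * d                           # long dual step
    return x

# Toy usage: a separable quadratic with gradient A * y.
A = np.array([10.0, 1.0])
x_final = accelerated_lp_descent(lambda y: A * y, x0=[5.0, 5.0])
print(x_final)
```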