Understanding Adam Requires Better Rotation Dependent Assumptions

📅 2024-10-25
🏛️ arXiv.org
📈 Citations: 4
Influential: 1
🤖 AI Summary
Adam's advantage over SGD is commonly analyzed under rotation-invariant assumptions, yet its actual sensitivity to parameter-space rotations remains poorly understood. Method: systematic empirical studies on Transformer models, evaluating Adam's behavior under both random and structured rotations, including layer-wise orthogonal transformations. Contribution/Results: random rotations substantially degrade Adam's convergence speed and final performance, whereas certain structured rotations preserve or even improve optimization outcomes; existing rotation-dependent assumptions in the literature fail to explain this dichotomy coherently. The paper presents empirical evidence of Adam's intrinsic rotational dependence, exposing limitations of conventional rotation-invariant optimization theory, and calls for theoretical frameworks that explicitly account for rotational effects in adaptive optimization. These findings inform both the understanding of adaptive optimizers' advantages and the design of more robust, geometry-aware optimization algorithms.

📝 Abstract
Despite its widespread adoption, Adam's advantage over Stochastic Gradient Descent (SGD) lacks a comprehensive theoretical explanation. This paper investigates Adam's sensitivity to rotations of the parameter space. We demonstrate that Adam's performance in training transformers degrades under random rotations of the parameter space, indicating a crucial sensitivity to the choice of basis. This reveals that conventional rotation-invariant assumptions are insufficient to capture Adam's advantages theoretically. To better understand the rotation-dependent properties that benefit Adam, we also identify structured rotations that preserve or even enhance its empirical performance. We then examine the rotation-dependent assumptions in the literature, evaluating their adequacy in explaining Adam's behavior across various rotation types. This work highlights the need for new, rotation-dependent theoretical frameworks to fully understand Adam's empirical success in modern machine learning tasks.
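The abstract's central claim can be illustrated with a minimal numerical sketch (an illustrative toy, not the paper's actual Transformer experiments): optimize a quadratic loss in its original basis and in a randomly rotated basis, map the rotated iterates back, and compare. Plain SGD is equivariant under orthogonal rotations, so the two trajectories coincide; Adam's per-coordinate second-moment normalization is tied to the basis, so they do not. All names and hyperparameters below are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
# Ill-conditioned quadratic loss f(x) = 0.5 * x^T D x, with diagonal D
D = np.diag(np.logspace(0, 3, d))
grad_f = lambda x: D @ x

# Random orthogonal rotation Q (via QR of a Gaussian matrix);
# the rotated loss is g(y) = f(Q^T y), with gradient Q grad_f(Q^T y)
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
grad_g = lambda y: Q @ grad_f(Q.T @ y)

def run(grad, x0, opt, steps=200):
    """Run a simple optimizer on the given gradient oracle."""
    x = x0.copy()
    if opt == "adam":
        m, v = np.zeros(d), np.zeros(d)
        b1, b2, eps, lr = 0.9, 0.999, 1e-8, 0.1
        for t in range(1, steps + 1):
            g = grad(x)
            m = b1 * m + (1 - b1) * g
            v = b2 * v + (1 - b2) * g**2
            mh, vh = m / (1 - b1**t), v / (1 - b2**t)
            x -= lr * mh / (np.sqrt(vh) + eps)  # elementwise: basis-dependent
    else:  # plain SGD
        for _ in range(steps):
            x -= 1e-3 * grad(x)
    return x

x0 = rng.standard_normal(d)

# SGD: rotating the problem, optimizing, and rotating back
# reproduces the unrotated trajectory (up to floating point)
sgd_plain = run(grad_f, x0, "sgd")
sgd_rot = Q.T @ run(grad_g, Q @ x0, "sgd")
print(np.allclose(sgd_plain, sgd_rot))   # True: SGD is rotation-equivariant

# Adam: the same comparison fails, because the elementwise
# sqrt(v) normalization depends on the choice of basis
adam_plain = run(grad_f, x0, "adam")
adam_rot = Q.T @ run(grad_g, Q @ x0, "adam")
print(np.allclose(adam_plain, adam_rot))  # False: Adam depends on the basis
```

This only shows that Adam is basis-dependent; the paper's contribution is characterizing *which* rotations hurt (random ones) versus which preserve or help performance (certain structured, e.g. layer-wise, rotations), and testing whether existing assumptions explain the difference.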
Problem

Research questions and friction points this paper is trying to address.

Adam's sensitivity to parameter space rotations challenges theoretical assumptions
Conventional rotation-invariant assumptions fail to explain Adam's empirical advantages
Orthogonality of updates may explain Adam's basis-dependent performance characteristics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzes Adam's sensitivity to parameter space rotations
Identifies structured rotations preserving Adam's performance
Proposes orthogonality of updates as key theoretical indicator