🤖 AI Summary
Existing Nesterov Accelerated Gradient (NAG) methods lack a rigorous theoretical generalization to Lie groups, with prior work limited to Polyak-type (PHB) momentum schemes. Method: This paper establishes the first variational framework for constructing NAG-style momentum algorithms on Lie groups. Leveraging left- or right-invariant Riemannian metrics, it models gradient flows on Lie groups and discretizes their continuous-time dynamics while preserving intrinsic geometric structure. This yields a precise correspondence between classical and accelerated momentum methods on nonlinear manifolds. Contribution/Results: The framework provides the first theoretically sound and implementable NAG-type acceleration scheme for optimization on Lie groups. Empirical evaluation on canonical tasks—such as optimization over SO(3) and SE(3)—demonstrates substantial improvements in convergence speed and numerical stability compared to existing approaches, thereby bridging a fundamental gap in geometric optimization theory.
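To make the Lie-group setting concrete, the following is a minimal illustrative sketch of a NAG-style look-ahead momentum step on SO(3): momentum is kept in the Lie algebra so(3) and the iterate is retracted onto the group with the exponential map (Rodrigues' formula). The objective (alignment with a target rotation), the step sizes, and this particular look-ahead discretization are assumptions for illustration only, not the paper's variational scheme.

```python
import numpy as np

def hat(w):
    """Map a vector in R^3 to a skew-symmetric matrix in the Lie algebra so(3)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(V):
    """Exponential map so(3) -> SO(3) via Rodrigues' formula (V skew-symmetric)."""
    w = np.array([V[2, 1], V[0, 2], V[1, 0]])
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3) + V
    return (np.eye(3)
            + (np.sin(theta) / theta) * V
            + ((1.0 - np.cos(theta)) / theta**2) * (V @ V))

def skew(M):
    """Skew-symmetric part of a matrix (projection onto so(3))."""
    return 0.5 * (M - M.T)

def riem_grad(R, T):
    """Left-trivialized Riemannian gradient of f(R) = 0.5 * ||R - T||_F^2."""
    return skew(R.T @ (R - T))

def nag_so3(T, R0, lr=0.2, mu=0.9, steps=200):
    """NAG-style look-ahead momentum on SO(3) (illustrative discretization)."""
    R, v = R0.copy(), np.zeros((3, 3))
    for _ in range(steps):
        R_ahead = R @ exp_so3(mu * v)             # look-ahead point on the group
        v = mu * v - lr * riem_grad(R_ahead, T)   # momentum lives in so(3)
        R = R @ exp_so3(v)                        # retract back onto SO(3)
    return R

# Example: drive a perturbed orientation back to the identity target.
R0 = exp_so3(hat(np.array([0.5, -0.3, 0.2])))
R_final = nag_so3(np.eye(3), R0)
```

Because every update multiplies by an exact rotation, the iterate stays on SO(3) up to floating-point error; this is the "intrinsic geometric structure" the summary refers to, in contrast to updating a raw 3x3 matrix and re-projecting.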
📝 Abstract
Polyak's Heavy Ball (PHB; Polyak, 1964), a.k.a. Classical Momentum, and Nesterov's Accelerated Gradient (NAG; Nesterov, 1983) are well-known examples of momentum-descent methods for optimization. While the latter outperforms the former, only generalizations of PHB-like methods to nonlinear spaces have been described in the literature. We propose here a generalization of NAG-like methods for Lie group optimization based on the variational one-to-one correspondence between classical and accelerated momentum methods (Campos et al., 2023). Numerical experiments are presented.
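As a point of reference, the two Euclidean schemes the abstract contrasts can be sketched as follows; the only difference is where the gradient is evaluated (current point for PHB, look-ahead point for NAG). The learning rate, momentum coefficient, and test objective below are illustrative choices, not values from the paper.

```python
import numpy as np

def phb(grad, x0, lr=0.05, mu=0.9, steps=300):
    """Polyak's Heavy Ball: gradient evaluated at the current point x."""
    x, v = x0.astype(float), np.zeros_like(x0, dtype=float)
    for _ in range(steps):
        v = mu * v - lr * grad(x)
        x = x + v
    return x

def nag(grad, x0, lr=0.05, mu=0.9, steps=300):
    """Nesterov's Accelerated Gradient: gradient at the look-ahead x + mu*v."""
    x, v = x0.astype(float), np.zeros_like(x0, dtype=float)
    for _ in range(steps):
        v = mu * v - lr * grad(x + mu * v)
        x = x + v
    return x

# Example: an ill-conditioned quadratic f(x) = 0.5 * x^T A x, minimized at 0.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_phb = phb(grad, np.array([1.0, 1.0]))
x_nag = nag(grad, np.array([1.0, 1.0]))
```

Both iterations drive the iterate to the minimizer; the look-ahead evaluation is what the abstract's "one-to-one correspondence" (Campos et al., 2023) relates back to the classical scheme, and it is this distinction that the paper transfers to the Lie group setting.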