Momentum-based gradient descent methods for Lie groups

📅 2024-04-14
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing Nesterov Accelerated Gradient (NAG) methods lack a rigorous theoretical generalization to Lie groups, with prior work limited to Polyak-type (PHB) momentum schemes. Method: This paper establishes the first variational framework for constructing NAG-style momentum algorithms on Lie groups. Leveraging left- or right-invariant Riemannian metrics, it models gradient flows on Lie groups and discretizes their continuous-time dynamics while preserving intrinsic geometric structure. This yields a precise correspondence between classical and accelerated momentum methods on nonlinear manifolds. Contribution/Results: The framework provides the first theoretically sound and implementable NAG-type acceleration scheme for optimization on Lie groups. Empirical evaluation on canonical tasks—such as optimization over SO(3) and SE(3)—demonstrates substantial improvements in convergence speed and numerical stability compared to existing approaches, thereby bridging a fundamental gap in geometric optimization theory.

Technology Category

Application Category

📝 Abstract
Polyak's Heavy Ball (PHB; Polyak, 1964), a.k.a. Classical Momentum, and Nesterov's Accelerated Gradient (NAG; Nesterov, 1983) are well know examples of momentum-descent methods for optimization. While the latter outperforms the former, solely generalizations of PHB-like methods to nonlinear spaces have been described in the literature. We propose here a generalization of NAG-like methods for Lie group optimization based on the variational one-to-one correspondence between classical and accelerated momentum methods (Campos et al., 2023). Numerical experiments are shown.
Problem

Research questions and friction points this paper is trying to address.

Generalizing NAG-like methods for Lie group optimization
Exploring PHB-like methods in nonlinear spaces
Demonstrating faster convergence for NAG in Lie groups
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalizes NAG for Lie group optimization
Uses variational momentum correspondence
Demonstrates faster convergence with NAG
🔎 Similar Papers
No similar papers found.
C
Cédric M. Campos
Departamento de Matemática Aplicada, Ciencia e Ingeniería de los Materiales y Tecnología Electrónica, Universidad Rey Juan Carlos, Calle Tulipán s/n, 28933 Móstoles, Spain
D
David Martín de Diego
Instituto de Ciencias Matemáticas (CSIC-UAM-UC3M-UCM), Calle Nicolás Cabrera 13-15, 28049 Madrid, Spain
J
José Torrente Teruel
Departamento de Matemáticas, Universidad de Córdoba, Edificio Albert Einstein, Campus de Rabanales, 14071 Córdoba, Spain