Online Optimization on Hadamard Manifolds: Curvature Independent Regret Bounds on Horospherically Convex Objectives

📅 2025-09-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing online Riemannian optimization methods on Hadamard manifolds rely on geodesic convexity, leading to regret bounds that explicitly depend on curvature and preventing attainment of Euclidean-optimal rates. Method: This paper introduces the novel framework of horospherical convexity (h-convexity) and designs curvature-independent online optimization algorithms. Specifically, it adapts Riemannian online gradient descent under an affine-invariant metric. Contribution/Results: The proposed method achieves $O(\sqrt{T})$ regret for general h-convex objectives and $O(\log T)$ regret for strongly h-convex objectives, matching the optimal Euclidean rates without explicit curvature dependence. Empirical validation on the symmetric positive definite (SPD) manifold demonstrates robust, curvature-agnostic convergence in two applications: online Tyler’s M-estimation and Fréchet mean computation. This work establishes the first curvature-free regret guarantees for online optimization on Hadamard manifolds.

📝 Abstract
We study online Riemannian optimization on Hadamard manifolds under the framework of horospherical convexity (h-convexity). Prior work mostly relies on geodesic convexity (g-convexity), leading to regret bounds that scale poorly with the manifold curvature. To address this limitation, we analyze Riemannian online gradient descent for h-convex and strongly h-convex functions and establish $O(\sqrt{T})$ and $O(\log T)$ regret guarantees, respectively. These bounds are curvature-independent and match the results in the Euclidean setting. We validate our approach with experiments on the manifold of symmetric positive definite (SPD) matrices equipped with the affine-invariant metric. In particular, we investigate online Tyler's $M$-estimation and online Fréchet mean computation, showing the application of h-convexity in practice.
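To make the setting concrete, below is a minimal sketch of Riemannian online gradient descent for the online Fréchet mean on the SPD manifold with the affine-invariant metric, as described in the abstract. The step-size schedule `eta / t`, the function names, and the use of the first stream element as the initial iterate are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm

def exp_map(X, V):
    # Affine-invariant exponential map on SPD matrices:
    # Exp_X(V) = X^{1/2} expm(X^{-1/2} V X^{-1/2}) X^{1/2}
    Xh = np.real(sqrtm(X))
    Xih = np.linalg.inv(Xh)
    return Xh @ expm(Xih @ V @ Xih) @ Xh

def log_map(X, A):
    # Inverse of the exponential map:
    # Log_X(A) = X^{1/2} logm(X^{-1/2} A X^{-1/2}) X^{1/2}
    Xh = np.real(sqrtm(X))
    Xih = np.linalg.inv(Xh)
    return Xh @ np.real(logm(Xih @ A @ Xih)) @ Xh

def online_frechet_mean(stream, eta=0.5):
    # Riemannian OGD for the losses f_t(X) = d^2(X, A_t), whose
    # Riemannian gradient under the affine-invariant metric is -2 Log_X(A_t).
    # `stream` is a list of SPD matrices; the first element initializes X
    # (an illustrative choice, not prescribed by the paper).
    X = stream[0].copy()
    for t, A in enumerate(stream, start=1):
        grad = -2.0 * log_map(X, A)
        X = exp_map(X, -(eta / t) * grad)  # retraction-free update via Exp_X
    return X
```

Because the update moves along geodesics via the exponential map, every iterate stays on the SPD manifold by construction, regardless of curvature.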
Problem

Research questions and friction points this paper is trying to address.

Achieving curvature-independent regret bounds in online Riemannian optimization
Overcoming limitations of geodesic convexity with horospherical convexity
Providing regret guarantees matching Euclidean results for h-convex functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using horospherical convexity for optimization
Riemannian online gradient descent algorithm
Curvature-independent regret bounds achieved
Emre Sahinoglu
Department of Mechanical & Industrial Engineering, Northeastern University, Boston, MA 02115
Shahin Shahrampour
Assistant Professor, Northeastern University
Optimization and Control · Multi-Agent Systems · Machine Learning · Reinforcement Learning