🤖 AI Summary
Existing online Riemannian optimization methods on Hadamard manifolds rely on geodesic convexity, leading to regret bounds that explicitly depend on curvature and preventing attainment of the optimal Euclidean rates. Method: This paper introduces the framework of horospherical convexity (h-convexity) and designs curvature-independent online optimization algorithms. Specifically, it analyzes Riemannian online gradient descent for h-convex and strongly h-convex objectives. Contribution/Results: The proposed method achieves $O(\sqrt{T})$ regret for general h-convex objectives and $O(\log T)$ regret for strongly h-convex objectives, matching the optimal Euclidean rates without explicit curvature dependence. Empirical validation on the manifold of symmetric positive definite (SPD) matrices, equipped with the affine-invariant metric, demonstrates robust, curvature-agnostic convergence in two applications: online Tyler's M-estimation and online Fréchet mean computation. This work establishes the first curvature-free regret guarantees for online optimization on Hadamard manifolds.
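The algorithmic core is the standard Riemannian online gradient descent update: once the round-$t$ loss is revealed, the iterate moves along the geodesic in the direction of the negative Riemannian gradient, $x_{t+1} = \mathrm{Exp}_{x_t}(-\eta_t \, \mathrm{grad}\, f_t(x_t))$. The sketch below is a minimal, manifold-agnostic illustration of that update; the helper names (`exp_map`, `riem_grad`) and the step-size sequence are placeholders introduced for this example rather than identifiers from the paper, and the familiar schedules $\eta_t \propto 1/\sqrt{t}$ (h-convex) and $\eta_t \propto 1/t$ (strongly h-convex) are stated here as assumptions.

```python
def riemannian_ogd(x0, exp_map, riem_grad, losses, step_sizes):
    """Minimal Riemannian online gradient descent sketch (hypothetical helper API).

    x0         -- initial point on the manifold
    exp_map    -- exp_map(x, v): exponential map at x applied to tangent vector v
    riem_grad  -- riem_grad(f, x): Riemannian gradient of loss f at x
    losses     -- loss functions f_1, ..., f_T revealed one per round
    step_sizes -- step sizes eta_1, ..., eta_T (e.g. ~1/sqrt(t) or ~1/t)
    """
    x = x0
    iterates = [x0]
    for f_t, eta_t in zip(losses, step_sizes):
        g_t = riem_grad(f_t, x)        # Riemannian gradient of the loss just revealed
        x = exp_map(x, -eta_t * g_t)   # geodesic step in the direction of -g_t
        iterates.append(x)
    return iterates
```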
📝 Abstract
We study online Riemannian optimization on Hadamard manifolds under the framework of horospherical convexity (h-convexity). Prior work mostly relies on geodesic convexity (g-convexity), leading to regret bounds that scale poorly with the manifold's curvature. To address this limitation, we analyze Riemannian online gradient descent for h-convex and strongly h-convex functions and establish $O(\sqrt{T})$ and $O(\log T)$ regret guarantees, respectively. These bounds are curvature-independent and match the corresponding rates in the Euclidean setting. We validate our approach with experiments on the manifold of symmetric positive definite (SPD) matrices equipped with the affine-invariant metric. In particular, we investigate online Tyler's $M$-estimation and online Fréchet mean computation, demonstrating the practical applicability of h-convexity.
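To make the SPD setting concrete, here is a small sketch, assuming the closed-form affine-invariant maps $\mathrm{Exp}_X(V) = X^{1/2}\exp(X^{-1/2} V X^{-1/2})X^{1/2}$ and $\mathrm{Log}_X(Y) = X^{1/2}\log(X^{-1/2} Y X^{-1/2})X^{1/2}$, applied to online Fréchet mean computation: the round-$t$ loss $\tfrac{1}{2}d^2(X, A_t)$ has Riemannian gradient $-\mathrm{Log}_X(A_t)$, so each online gradient step moves toward the newly observed matrix $A_t$. The function names and the $\eta_t = 1/t$ step size are illustrative choices for this sketch, not the paper's exact implementation.

```python
import numpy as np

def _sym_fun(S, fun):
    """Apply a scalar function to a symmetric matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * fun(w)) @ V.T

def spd_exp(X, V_tan):
    """Affine-invariant exponential map: Exp_X(V) = X^{1/2} exp(X^{-1/2} V X^{-1/2}) X^{1/2}."""
    Xh = _sym_fun(X, np.sqrt)
    Xih = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_fun(Xih @ V_tan @ Xih, np.exp) @ Xh

def spd_log(X, Y):
    """Affine-invariant logarithm map: Log_X(Y) = X^{1/2} log(X^{-1/2} Y X^{-1/2}) X^{1/2}."""
    Xh = _sym_fun(X, np.sqrt)
    Xih = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_fun(Xih @ Y @ Xih, np.log) @ Xh

def online_frechet_mean(samples, eta=lambda t: 1.0 / t):
    """Online Fréchet mean on the SPD manifold (illustrative step size eta_t = 1/t).

    The round-t loss (1/2) d^2(X, A_t) has Riemannian gradient -Log_X(A_t),
    so the online gradient step is X <- Exp_X(eta_t * Log_X(A_t)).
    """
    X = samples[0]
    for t, A in enumerate(samples[1:], start=2):
        X = spd_exp(X, eta(t) * spd_log(X, A))
    return X
```

With the $1/t$ schedule this recursion matches the classical inductive (streaming) Fréchet mean estimator; other h-convex losses, such as the one underlying Tyler's $M$-estimation, would slot into the same update by swapping in their Riemannian gradients.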