🤖 AI Summary
This work addresses online optimization and minimax problems on Hadamard manifolds. We propose the first curvature-independent implicit Riemannian optimistic online learning algorithm, with support for intrinsic manifold constraints. The method combines implicit iterative updates with a novel Riemannian optimistic gradient design, achieving the optimal Euclidean regret bound of $O(\sqrt{T})$ without requiring any geometric constants (such as curvature lower bounds) for the first time. We further extend the framework to $g$-convex, $g$-concave smooth minimax optimization, attaining a gradient complexity of $\tilde{O}(1/\varepsilon^2)$, which nearly matches the theoretical lower bound and significantly improves upon existing manifold-based algorithms. Key contributions include: (i) eliminating curvature dependence entirely; (ii) a unified treatment of online learning and minimax optimization on manifolds; and (iii) recovering Euclidean optimality in nonlinear geometric spaces.
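To fix ideas, an implicit (proximal-style) update on a Hadamard manifold with geodesic distance $d$ takes the standard form shown below. This is the generic template such methods instantiate, not the paper's exact optimistic update, which additionally incorporates a gradient hint and tolerates inexact solves:

$$
x_{t+1} \;=\; \operatorname*{arg\,min}_{x \in \mathcal{X}} \Big\{\, \eta\, f_t(x) + \tfrac{1}{2}\, d^2(x, x_t) \,\Big\},
\qquad
x_{t+1} \;=\; \mathrm{Exp}_{x_t}\!\big( -\eta\, \mathrm{grad}\, f_t(x_{t+1}) \big)
\ \ (\text{when } \mathcal{X} \text{ is the whole manifold}),
$$

where $\mathrm{Exp}$ is the exponential map. Because the gradient is evaluated at the new point $x_{t+1}$, the update is implicit and must be solved, at least approximately, at each round.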
📝 Abstract
We introduce a Riemannian optimistic online learning algorithm for Hadamard manifolds based on inexact implicit updates. Unlike prior work, our method can handle in-manifold constraints, and it matches the best known regret bounds of the Euclidean setting with no dependence on geometric constants such as the minimum curvature. Building on this, we develop algorithms for $g$-convex, $g$-concave smooth min-max problems on Hadamard manifolds. Notably, one method's gradient oracle complexity nearly matches the lower bound for Euclidean problems, for the first time.
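As a concrete illustration, the sketch below runs one inexact implicit optimistic step on the hyperboloid model of hyperbolic space, a prototypical Hadamard manifold. The loss $f_t(x) = \tfrac{1}{2} d^2(x, z_t)$, the fixed-point inner solver, and the parameter choices (`eta`, `inner_iters`) are illustrative assumptions for this sketch, not the paper's exact algorithm; the aim is only to show the ingredients the abstract mentions: exponential maps, parallel transport, an optimistic warm start from the previous gradient, and an inexact implicit solve.

```python
import numpy as np

# --- Hyperboloid model of hyperbolic space (a Hadamard manifold) ---
# Points x in R^{d+1} with <x, x>_L = -1, x_0 > 0, where
# <u, v>_L = -u_0 v_0 + u_1 v_1 + ... + u_d v_d is the Minkowski product.

def mdot(u, v):
    """Minkowski inner product."""
    return -u[0] * v[0] + u[1:] @ v[1:]

def dist(x, y):
    """Geodesic distance on the hyperboloid."""
    return np.arccosh(np.clip(-mdot(x, y), 1.0, None))

def exp_map(x, v):
    """Exponential map: follow the geodesic from x with initial velocity v."""
    n = np.sqrt(max(mdot(v, v), 0.0))
    if n < 1e-12:
        return x
    return np.cosh(n) * x + np.sinh(n) * (v / n)

def log_map(x, y):
    """Inverse of exp_map: tangent vector at x pointing toward y."""
    u = y + mdot(x, y) * x            # project y onto the tangent space at x
    n = np.sqrt(max(mdot(u, u), 0.0))
    if n < 1e-12:
        return np.zeros_like(x)
    return dist(x, y) * (u / n)

def transport(x, y, v):
    """Parallel transport of tangent vector v from x to y along the geodesic."""
    return v + mdot(y, v) / (1.0 - mdot(x, y)) * (x + y)

# --- Inexact implicit optimistic step for f_t(x) = 0.5 * d(x, z_t)^2 ---
# Riemannian gradient of this loss: grad f_t(x) = -log_map(x, z_t).

def implicit_optimistic_step(x, z_prev, z_t, eta=0.1, inner_iters=10):
    """One illustrative inexact implicit update with an optimistic hint.

    Warm-starts from a step along the previous loss's gradient, then
    approximately solves the implicit equation  y = exp_x(-eta * grad f_t(y))
    by fixed-point iteration, transporting the gradient back to x each time.
    """
    hint = -log_map(x, z_prev)        # gradient of the previous loss at x
    y = exp_map(x, -eta * hint)       # optimistic warm start
    for _ in range(inner_iters):
        g = -log_map(y, z_t)          # grad f_t at the current iterate y
        g_at_x = transport(y, x, g)   # move it to the tangent space at x
        y = exp_map(x, -eta * g_at_x) # refine the implicit step
    return y

# Usage: track a sequence of targets z_t on the 2-D hyperboloid.
rng = np.random.default_rng(0)

def lift(p):
    """Embed a point p in R^2 onto the hyperboloid in R^3."""
    return np.array([np.sqrt(1.0 + p @ p), *p])

x = lift(np.zeros(2))
z_prev = lift(rng.normal(size=2))
for _ in range(5):
    z_t = lift(rng.normal(size=2))
    x = implicit_optimistic_step(x, z_prev, z_t, eta=0.3)
    z_prev = z_t
print("final point:", x, " constraint <x,x>_L =", mdot(x, x))
```

Here the implicit equation is solved by plain fixed-point iteration: for small step sizes the map is a contraction, which is what makes inexact implicit updates cheap in practice; the paper's inner solver and accuracy schedule may of course differ.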