Mirror Descent on Riemannian Manifolds

📅 2026-03-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
Optimization over large-scale Riemannian manifolds poses significant challenges for the direct application of classical mirror descent methods. This work proposes a Riemannian Mirror Descent (RMD) framework that generalizes mirror descent to arbitrary Riemannian manifolds through reparameterization, and develops its stochastic variant. Non-asymptotic convergence guarantees are established, for the first time, for mirror descent on general Riemannian manifolds. When specialized to the Stiefel manifold, the framework recovers the Curvilinear Gradient Descent (CGD) method of [26] and yields its stochastic extension. The theoretical analysis integrates tools from Riemannian geometry, stochastic gradient estimation, and non-convex optimization. Empirical evaluations demonstrate the efficiency and scalability of the proposed methods across diverse tasks, including image processing, policy optimization, and neural network training.
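
For context (this is the standard textbook notation, not the paper's own), the Euclidean mirror descent step that RMD generalizes reads:

\[
x_{k+1} \;=\; \arg\min_{x \in \mathcal{X}} \Big\{ \eta \,\langle \nabla f(x_k), x \rangle + D_\psi(x, x_k) \Big\},
\qquad
D_\psi(x, y) \;=\; \psi(x) - \psi(y) - \langle \nabla \psi(y),\, x - y \rangle,
\]

where ψ is a strongly convex mirror map and η > 0 is the step size. One plausible reading of "via reparameterization" (an assumption on our part; the paper gives the precise construction) is that a smooth map φ from a parameter space onto the manifold lets RMD apply a mirror-descent-style step to the pulled-back objective f ∘ φ.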

📝 Abstract
Mirror Descent (MD) is a scalable first-order method widely used in large-scale optimization, with applications in image processing, policy optimization, and neural network training. This paper generalizes MD to optimization on Riemannian manifolds. In particular, we develop a Riemannian Mirror Descent (RMD) framework via reparameterization and further propose a stochastic variant of RMD. We also establish non-asymptotic convergence guarantees for both RMD and stochastic RMD. As an application to the Stiefel manifold, our RMD framework reduces to the Curvilinear Gradient Descent (CGD) method proposed in [26]. Moreover, when specializing the stochastic RMD framework to the Stiefel setting, we obtain a stochastic extension of CGD, which effectively addresses large-scale manifold optimization problems.
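
The abstract includes no pseudocode; as intuition for the flat-space method being generalized, here is a minimal, self-contained sketch of a classical mirror descent instance (entropic MD on the probability simplex). The function names and the quadratic objective are illustrative assumptions, not taken from the paper; the paper's RMD replaces this constraint set with a general Riemannian manifold via reparameterization.

import numpy as np

def entropic_mirror_descent(grad_f, x0, step=0.1, n_iters=200):
    """Classical mirror descent on the probability simplex with the
    negative-entropy mirror map; each iteration is a multiplicative-weights
    update followed by renormalization. Illustrative sketch, not the
    paper's RMD algorithm."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad_f(x)
        x = x * np.exp(-step * g)  # gradient step in the dual (mirror) space
        x /= x.sum()               # inverse mirror map back onto the simplex
    return x

# Hypothetical objective: f(x) = 0.5 * ||x - c||^2 with c on the simplex,
# so the constrained minimizer is c itself.
c = np.array([0.2, 0.5, 0.3])
x_star = entropic_mirror_descent(lambda x: x - c, np.ones(3) / 3)
print(np.round(x_star, 3))  # approaches [0.2, 0.5, 0.3]

The negative-entropy mirror map is the standard choice on the simplex because its Bregman divergence (the KL divergence) keeps iterates strictly feasible without any projection step.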
Problem

Research questions and friction points this paper is trying to address.

Mirror Descent
Riemannian Manifolds
Manifold Optimization
Stiefel Manifold
Large-scale Optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Riemannian Mirror Descent
stochastic optimization
manifold optimization
non-asymptotic convergence
Stiefel manifold
Jiaxin Jiang
School of Mathematical Sciences, Fudan University, Shanghai 200433, China
Lei Shi
Professor at Fudan University, Department of Physics (Photonics and Materials Science)
Jiyuan Tan
School of Mathematical Sciences, Fudan University, Shanghai 200433, China