Stochastic Modified Flows for Riemannian Stochastic Gradient Descent

📅 2024-02-02
πŸ›οΈ SIAM Journal of Control and Optimization
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the continuous-time approximation of Riemannian stochastic gradient descent (RSGD) in the small learning rate regime. Since the classical Riemannian gradient flow (RGF) has limited accuracy as an approximation, we propose the Riemannian stochastic modified flow (RSMF), a diffusion process driven by an infinite-dimensional Wiener process that retains the random fluctuations of RSGD. Methodologically, we establish quantitative weak error bounds between RSGD and the RSMF, capturing the coupled effects of manifold geometry, discretization, and gradient noise through the retraction map. Theoretically, we prove that the RSMF approximates RSGD to one order higher than the RGF, and we derive an explicit upper bound on the approximation error in terms of the curvature of the manifold, the quality of the retraction, and the variance of the gradient estimators. This yields a more accurate and geometrically consistent stochastic differential-geometric framework for dynamic modeling and convergence analysis in Riemannian optimization.
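For orientation, here is a minimal sketch of the objects the summary refers to, written in notation assumed here rather than taken from the paper: one RSGD step built from a retraction R and a random gradient estimator, together with the Riemannian gradient flow it approximates to first order.

```latex
% Assumed notation (not necessarily the paper's): learning rate \eta,
% retraction R, random gradient estimators \operatorname{grad} f_{\xi_k}.
\begin{align*}
  X_{k+1}   &= R_{X_k}\!\bigl(-\eta\,\operatorname{grad} f_{\xi_{k+1}}(X_k)\bigr)
            && \text{(one RSGD step)} \\
  \dot{Y}_t &= -\operatorname{grad} f(Y_t)
            && \text{(Riemannian gradient flow, first-order limit)}
\end{align*}
% The RSMF augments the gradient flow with an O(\eta) drift correction and
% an O(\sqrt{\eta}) diffusion term driven by a Wiener process; matching the
% fluctuations of RSGD is what buys the extra order of approximation.
```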

πŸ“ Abstract
We give quantitative estimates for the rate of convergence of Riemannian stochastic gradient descent (RSGD) to Riemannian gradient flow and to a diffusion process, the so-called Riemannian stochastic modified flow (RSMF). Using tools from stochastic differential geometry, we show that, in the small learning rate regime, RSGD can be approximated by the solution to the RSMF driven by an infinite-dimensional Wiener process. The RSMF accounts for the random fluctuations of RSGD and thereby increases the order of approximation compared to the deterministic Riemannian gradient flow. The RSGD is built using the concept of a retraction map, that is, a cost-efficient approximation of the exponential map, and we prove quantitative bounds for the weak error of the diffusion approximation under assumptions on the retraction map, the geometry of the manifold, and the random estimators of the gradient.
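To make the abstract's objects concrete, the following is a hypothetical, self-contained sketch (not taken from the paper; the objective, estimator, and all names are ours) of RSGD on the unit sphere: the cost is a Rayleigh quotient, the gradient estimator is deliberately noisy, and the normalization retraction R_x(v) = (x + v)/||x + v|| stands in for the exponential map.

```python
# Hypothetical illustration of RSGD on the sphere S^{n-1}; not code from
# the paper. Minimizes the Rayleigh quotient f(x) = x^T A x over ||x|| = 1.
import numpy as np

rng = np.random.default_rng(0)
n = 10
Q = rng.standard_normal((n, n))
A = Q @ Q.T  # symmetric positive semi-definite objective matrix

def riem_grad(x, noise=0.0):
    """Noisy Euclidean gradient of f, projected onto the tangent space at x."""
    g = 2.0 * (A @ x) + noise * rng.standard_normal(n)  # unbiased estimator
    return g - (g @ x) * x                              # tangent projection

def retract(x, v):
    """Normalization retraction: a cheap stand-in for the exponential map."""
    y = x + v
    return y / np.linalg.norm(y)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)            # start on the sphere
eta = 1e-3                        # the small learning rate regime
for _ in range(20_000):
    x = retract(x, -eta * riem_grad(x, noise=1.0))

print(f"f(x) = {x @ A @ x:.6f}")
print(f"smallest eigenvalue = {np.linalg.eigvalsh(A)[0]:.6f}")
```

In the small learning rate regime the iterates hover near the minimizer (an eigenvector of the smallest eigenvalue), with residual fluctuations coming from the noisy estimator; the RSMF is the continuous-time process designed to track these fluctuations.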
Problem

Research questions and friction points this paper is trying to address.

Quantify convergence rate of Riemannian stochastic gradient descent.
Approximate RSGD by Riemannian stochastic modified flow.
Analyze weak error bounds under geometric and retraction assumptions (see the sketch below).
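The third item refers to a weak-error notion that, in test-function form, typically reads as below. This is a hedged sketch in assumed notation; the orders p shown are the standard ones consistent with the summary's one-order-higher claim, while the precise constants, function classes, and assumptions are the paper's subject.

```latex
% Assumed notation: X_k the RSGD iterates, Y the continuous-time process,
% g a smooth test function, \eta the learning rate, horizon T.
\max_{0 \le k \le T/\eta}
\bigl|\,\mathbb{E}\bigl[g(X_k)\bigr] - \mathbb{E}\bigl[g(Y_{k\eta})\bigr]\bigr|
\;\le\; C(g, T)\,\eta^{p},
\qquad
p = \begin{cases}
  1 & \text{for the Riemannian gradient flow,}\\
  2 & \text{for the RSMF (one order higher).}
\end{cases}
```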
Innovation

Methods, ideas, or system contributions that make the work stand out.

RSGD approximated by Riemannian stochastic modified flow
Retraction map used as a cost-efficient approximation of the exponential map (numerical sketch after this list)
Quantitative bounds for weak error in diffusion approximation
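The retraction item is easy to check numerically. The following hypothetical sketch (the choice of manifold and all names are ours, not the paper's) compares the normalization retraction on the sphere with the exact exponential map; their difference shrinks rapidly with the step size, which is why the cheap retraction can replace the exponential map inside RSGD.

```python
# Hypothetical check, not from the paper: how closely the normalization
# retraction on the unit sphere tracks the exact exponential map.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5)
x /= np.linalg.norm(x)            # base point on the sphere
v = rng.standard_normal(5)
v -= (v @ x) * x                  # tangent vector at x

def exp_map(x, v):
    """Exact exponential map on the unit sphere (great-circle step)."""
    nv = np.linalg.norm(v)
    return np.cos(nv) * x + np.sin(nv) * v / nv

def retract(x, v):
    """Normalization retraction R_x(v) = (x + v) / ||x + v||."""
    y = x + v
    return y / np.linalg.norm(y)

for t in (1e-1, 1e-2, 1e-3):
    err = np.linalg.norm(retract(x, t * v) - exp_map(x, t * v))
    print(f"t = {t:.0e}: ||R_x(tv) - exp_x(tv)|| = {err:.3e}")
# The error decays like t^3: this particular retraction happens to be second
# order on the sphere, while the general definition only requires
# first-order agreement with the exponential map.
```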