🤖 AI Summary
This paper addresses the problem of approximating optimal transport (OT) maps from a continuous source measure μ to a discrete target measure ν in the semi-discrete OT setting, particularly when μ lacks compact support. To tackle this challenge, the authors propose a stochastic gradient descent (SGD) method and establish, for the first time, statistical convergence guarantees for OT map estimation under MTW-class cost functions and mild regularity assumptions: the averaged projected SGD achieves the minimax-optimal rate of O(1/√n). A key contribution is an adaptive projection mechanism tailored to non-compact support settings, which ensures that the iterates converge to the true OT map. This analysis removes the bounded-support assumption that SGD-based OT methods conventionally rely on. Numerical experiments demonstrate the algorithm's stability and efficiency across diverse cost functions, including the Euclidean distance and cosine similarity, validating its practical robustness.
📝 Abstract
We investigate the semi-discrete Optimal Transport (OT) problem, where a continuous source measure $\mu$ is transported to a discrete target measure $\nu$, with particular attention to the OT map approximation. In this setting, Stochastic Gradient Descent (SGD) based solvers have demonstrated strong empirical performance in recent machine learning applications, yet their theoretical guarantees for approximating the OT map remain an open question. In this work, we answer it positively by providing both computational and statistical convergence guarantees for SGD. Specifically, we show that SGD methods can estimate the OT map at the minimax convergence rate of $\mathcal{O}(1/\sqrt{n})$, where $n$ is the number of samples drawn from $\mu$. To establish this result, we study the averaged projected SGD algorithm and identify a suitable projection set that contains a minimizer of the objective, even when the source measure is not compactly supported. Our analysis holds under mild assumptions on the source measure and applies to MTW cost functions, which include $\|\cdot\|^p$ for $p \in (1, \infty)$. We finally provide numerical evidence for our theoretical results.
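To make the algorithm referenced in the abstract concrete, the sketch below shows a generic averaged projected SGD on the semi-discrete OT dual (semi-dual) objective. This is an illustrative implementation under standard assumptions, not the paper's exact method: the projection set here is a plain Euclidean ball of a user-chosen radius (the paper's adaptive projection set is more refined), the step size `lr0/√t` and all function names are hypothetical choices.

```python
import numpy as np

def averaged_projected_sgd(sample_mu, Y, nu, cost,
                           n_iters=5000, radius=10.0, lr0=1.0):
    """Illustrative averaged projected SGD for the semi-discrete OT semi-dual
    max_g  E_{x~mu}[ min_j ( c(x, y_j) - g_j ) ] + <g, nu>.
    Each step: sample x ~ mu, take a stochastic (ascent) gradient, project g
    onto a Euclidean ball (stand-in for the paper's projection set), and keep
    a Polyak-Ruppert running average of the iterates."""
    m = len(nu)
    g = np.zeros(m)        # dual potential on the discrete support {y_j}
    g_avg = np.zeros(m)
    for t in range(1, n_iters + 1):
        x = sample_mu()
        # Index of the Laguerre cell containing x: argmin_j c(x, y_j) - g_j.
        j = int(np.argmin(np.array([cost(x, y) for y in Y]) - g))
        grad = nu.copy()
        grad[j] -= 1.0     # stochastic gradient of the semi-dual at g
        g = g + (lr0 / np.sqrt(t)) * grad
        # Projection onto the ball {||g|| <= radius}.
        norm = np.linalg.norm(g)
        if norm > radius:
            g *= radius / norm
        g_avg += (g - g_avg) / t   # running (averaged) iterate
    return g_avg
```

Given the averaged potential `g`, the estimated OT map sends a source point `x` to the target atom `Y[argmin_j cost(x, Y[j]) - g[j]]`, i.e. the center of the Laguerre cell that contains `x`.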