🤖 AI Summary
This work addresses the existence and explicit construction of Busemann functions on the Wasserstein space. Focusing on two fundamental cases, univariate distributions and Gaussian measures, we derive, for the first time, closed-form expressions for their Busemann functions. Leveraging the formal Riemannian geometry induced by optimal transport, we explicitly compute Busemann projections of probability measures along geodesic rays. Building on these projections, we propose the Busemann sliced Wasserstein distance (BSWD), which enriches geometric modeling in spaces of distributions. Our approach integrates Wasserstein geometry, geodesic analysis, and Busemann theory. We validate its effectiveness on Gaussian mixture models and labeled datasets, and demonstrate, on synthetic data and transfer learning tasks, that BSWD outperforms the conventional sliced Wasserstein distance in computational efficiency, interpretability, and discriminative power.
📝 Abstract
The Busemann function has recently attracted much interest in a variety of geometric machine learning problems, as it naturally defines projections onto geodesic rays of Riemannian manifolds and generalizes the notion of hyperplanes. As several sources of data can be conveniently modeled as probability distributions, it is natural to study this function in the Wasserstein space, which carries a rich formal Riemannian structure induced by Optimal Transport metrics. In this work, we investigate the existence and computation of Busemann functions in Wasserstein space, which admits geodesic rays. We establish closed-form expressions in two important cases: one-dimensional distributions and Gaussian measures. These results enable explicit projection schemes for probability distributions on $\mathbb{R}$, which in turn allow us to define novel Sliced-Wasserstein distances over Gaussian mixtures and labeled datasets. We demonstrate the efficiency of these schemes on synthetic datasets as well as transfer learning problems.
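To make the one-dimensional setting concrete, recall two standard facts (not specific to this paper): the 2-Wasserstein distance between measures on $\mathbb{R}$ is the $L^2$ distance between their quantile functions, and the Busemann function of a unit-speed ray $\gamma$ is $B(\mu) = \lim_{t\to\infty} W_2(\mu, \gamma(t)) - t$, which in a Hilbertian embedding reduces to a linear functional. The sketch below, a numerical illustration rather than the paper's construction, approximates this limit for a hypothetical ray in quantile coordinates and checks it against the inner-product expression; the base measure, ray direction, and target measure are all illustrative choices.

```python
import numpy as np

# Midpoint grid on (0, 1) for discretizing quantile functions.
n = 2000
u = (np.arange(n) + 0.5) / n

def w2(q1, q2):
    """W2 distance between 1-D measures represented by their quantile
    functions sampled on u: W2^2 = integral of the squared difference."""
    return np.sqrt(np.mean((q1 - q2) ** 2))

# Unit-speed geodesic ray in quantile coordinates: gamma(t) = q0 + t * v,
# with v nondecreasing and of unit L2 norm (illustrative choices).
q0 = np.zeros(n)                      # base measure: Dirac mass at 0
v = u / np.sqrt(np.mean(u ** 2))      # hypothetical ray direction, normalized

# Target measure: uniform on [0.5, 2.0], via its quantile function.
q_mu = 0.5 + 1.5 * u

# Busemann value as a limit: B(mu) = lim_t W2(mu, gamma(t)) - t,
# approximated at a large but finite t.
t = 1e6
b_limit = w2(q_mu, q0 + t * v) - t

# Linear (inner-product) form in the quantile embedding: -<q_mu - q0, v>.
b_closed = -np.mean((q_mu - q0) * v)

print(b_limit, b_closed)  # the two values agree up to O(1/t) truncation error
```

The agreement of the two quantities reflects the Hilbertian picture: 1-D Wasserstein space embeds isometrically into $L^2([0,1])$ via quantile functions, where Busemann functions of rays are affine. The Gaussian case in the paper requires a separate treatment via the Bures-Wasserstein geometry and is not covered by this sketch.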