Doubly Regularized Entropic Wasserstein Barycenters

📅 2023-03-21
🏛️ arXiv.org
📈 Citations: 14
Influential: 1
🤖 AI Summary
This paper addresses the modeling and optimization of regularized Wasserstein barycenters. To overcome limitations of existing approaches, particularly concerning regularization design, stability, and grid-free optimization, we propose the doubly regularized entropic Wasserstein barycenter, which minimizes a sum of entropic optimal transport (EOT) costs plus an entropy term, with two regularization strengths: an inner one (λ, acting on the transport couplings) and an outer one (τ, acting on the barycenter itself). Theoretically, when the regularization parameters satisfy τ = λ/2, the estimation bias is O(λ²), so the construction is debiased; for λ, τ > 0 the barycenter has a smooth density, and the solution is strongly stable under perturbations of the marginal distributions. Algorithmically, we employ noisy particle gradient descent, achieving global exponential convergence in the mean-field limit; given n samples from each marginal, the estimate converges in relative entropy to the population barycenter at rate O(n⁻¹/²). Our framework unifies and generalizes major barycenter models, combining theoretical rigor with computational scalability.
📝 Abstract
We study a general formulation of regularized Wasserstein barycenters that enjoys favorable regularity, approximation, stability and (grid-free) optimization properties. This barycenter is defined as the unique probability measure that minimizes the sum of entropic optimal transport (EOT) costs with respect to a family of given probability measures, plus an entropy term. We denote it the $(\lambda,\tau)$-barycenter, where $\lambda$ is the inner regularization strength and $\tau$ the outer one. This formulation recovers several previously proposed EOT barycenters for various choices of $\lambda, \tau \geq 0$ and generalizes them. First, in spite of -- and in fact owing to -- being *doubly* regularized, we show that our formulation is debiased for $\tau = \lambda/2$: the suboptimality in the (unregularized) Wasserstein barycenter objective is, for smooth densities, of the order of the strength $\lambda^2$ of entropic regularization, instead of $\max\{\lambda, \tau\}$ in general. We discuss this phenomenon for isotropic Gaussians, where all $(\lambda,\tau)$-barycenters have a closed form. Second, we show that for $\lambda, \tau > 0$, this barycenter has a smooth density and is strongly stable under perturbation of the marginals. In particular, it can be estimated efficiently: given $n$ samples from each of the probability measures, it converges in relative entropy to the population barycenter at a rate $n^{-1/2}$. And finally, this formulation lends itself naturally to a grid-free optimization algorithm: we propose a simple *noisy particle gradient descent* which, in the mean-field limit, converges globally at an exponential rate to the barycenter.
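Concretely, the definition stated in the abstract can be written as the following objective (a sketch; the uniform weights $1/K$ and the sign convention for the entropy $H$ are assumptions here, not taken from the paper):

```latex
\mu^{\star}_{\lambda,\tau}
  = \operatorname*{arg\,min}_{\mu \in \mathcal{P}(\mathbb{R}^d)}
    \frac{1}{K} \sum_{k=1}^{K} \mathrm{OT}_{\lambda}(\mu, \nu_k)
    + \tau \, H(\mu)
```

where $\mathrm{OT}_{\lambda}$ is the EOT cost with inner regularization strength $\lambda$, $\nu_1, \dots, \nu_K$ are the given marginals, and $H$ is an entropy term on the barycenter itself; the debiased regime discussed in the abstract corresponds to $\tau = \lambda/2$.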
Problem

Research questions and friction points this paper is trying to address.

Debiased regularized Wasserstein barycenters for smooth densities
Strong stability and smooth density under marginal perturbations
Grid-free optimization with noisy particle gradient descent
Innovation

Methods, ideas, or system contributions that make the work stand out.

Doubly regularized entropic Wasserstein barycenters formulation
Debiased for specific regularization strength choices
Grid-free noisy particle gradient descent optimization
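The grid-free algorithm named above can be illustrated with a toy sketch: a minimal noisy particle gradient descent for a 1D entropic barycenter of two empirical measures, assuming quadratic cost, uniform weights, and a basic Sinkhorn solver. All parameter choices and the exact update rule here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sinkhorn_plan(x, y, lam, iters=200):
    """Entropic OT coupling between uniform empirical measures on x and y."""
    C = (x[:, None] - y[None, :]) ** 2          # quadratic cost matrix
    K = np.exp(-C / lam)                        # Gibbs kernel
    m, n = len(x), len(y)
    a, b = np.full(m, 1.0 / m), np.full(n, 1.0 / n)
    u, v = np.ones(m), np.ones(n)
    for _ in range(iters):                      # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]          # coupling pi

def noisy_particle_barycenter(marginals, lam=1.0, tau=0.05,
                              eta=0.2, steps=150, m=40, seed=0):
    """Toy noisy particle gradient descent for a (lam, tau)-barycenter."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=m)          # barycenter particles
    for _ in range(steps):
        grad = np.zeros(m)
        for y in marginals:
            pi = sinkhorn_plan(x, y, lam)
            T = (pi @ y) / pi.sum(axis=1)       # entropic barycentric map
            grad += (x - T) / len(marginals)    # pull toward each marginal
        # Gradient step plus Gaussian noise: the outer entropy term tau
        # enters as a Langevin-style diffusion on the particles.
        x = x - eta * grad + np.sqrt(2.0 * eta * tau) * rng.standard_normal(m)
    return x

rng = np.random.default_rng(1)
nu1 = rng.normal(-2.0, 0.2, size=40)
nu2 = rng.normal(+2.0, 0.2, size=40)
bary = noisy_particle_barycenter([nu1, nu2])
print(bary.mean())   # particles settle near the midpoint of the two marginals
```

In the paper's mean-field limit this dynamics converges globally at an exponential rate; the sketch above is only a finite-particle, finite-sample caricature of that scheme.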