Monge-Kantorovich Fitting With Sobolev Budgets

📅 2024-09-25
🏛️ arXiv.org
📈 Citations: 1
Influential: 1
📄 PDF
🤖 AI Summary
This work addresses the problem of Wasserstein-$p$ optimal approximation of high-dimensional probability measures by measures supported on low-dimensional subspaces, with Sobolev norms ($W^{k,q}$) explicitly constraining the complexity of transport maps, thereby unifying manifold learning and regularized generative modeling. Methodologically, it incorporates Sobolev norms into an optimal-transport-driven dimensionality reduction framework, interpreting them as a "complexity budget" for transport maps. It introduces the barycenter field to characterize the gradient of the objective functional and establishes a near-strict monotonicity theory for the objective under higher-order differentiability assumptions. It also constructs a natural discretization scheme with provable discrete consistency. These contributions provide a novel theoretical foundation for the role of regularization in generative models and extend the applicability of optimal transport to structured, regularized dimensionality reduction.

📝 Abstract
Given $m<n$, we consider the problem of ``best'' approximating an $n\text{-d}$ probability measure $\rho$ via an $m\text{-d}$ measure $\nu$ such that $\mathrm{supp}\,\nu$ has bounded total ``complexity.'' When $\rho$ is concentrated near an $m\text{-d}$ set we may interpret this as a manifold learning problem with noisy data. However, we do not restrict our analysis to this case, as the more general formulation has broader applications. We quantify $\nu$'s performance in approximating $\rho$ via the Monge-Kantorovich (also called Wasserstein) $p$-cost $\mathbb{W}_p^p(\rho,\nu)$, and constrain the complexity by requiring $\mathrm{supp}\,\nu$ to be coverable by an $f : \mathbb{R}^{m} \to \mathbb{R}^{n}$ whose $W^{k,q}$ Sobolev norm is bounded by $\ell \geq 0$. This allows us to reformulate the problem as minimizing a functional $\mathscr{J}_p(f)$ under the Sobolev ``budget'' $\ell$. This problem is closely related to (but distinct from) principal curves with length constraints when $m=1,\ k=1$, and to an unsupervised analogue of smoothing splines when $k>1$. New challenges arise from the higher-order differentiability condition. We study the ``gradient'' of $\mathscr{J}_p$, which is given by a certain vector field that we call the barycenter field, and use it to prove a nontrivial (almost) strict monotonicity result. We also provide a natural discretization scheme and establish its consistency. We use this scheme as a toy model for a generative learning task and, by analogy, propose novel interpretations for the role regularization plays in improving training.
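To make the setup concrete, here is a minimal toy sketch (not the paper's actual scheme) of the $m=1$ case: a curve $f:\mathbb{R}\to\mathbb{R}^2$ is discretized as $N$ points, the fit to $\rho$ is measured by a nearest-point quantization proxy for $\mathbb{W}_p^p(\rho,\nu)$, and smoothness is penalized via squared second differences as a finite-difference stand-in for a $W^{2,2}$ seminorm. The data, the Lloyd-style barycenter update, and the weight `lam` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: samples of rho concentrated near a 1-d set (a circle) in R^2.
t = rng.uniform(0.0, 2.0 * np.pi, 200)
data = np.stack([np.cos(t), np.sin(t)], axis=1) + 0.05 * rng.normal(size=(200, 2))

# Discretize f : R -> R^2 as N points (a polygonal curve), starting small.
N = 32
s = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
curve = 0.5 * np.stack([np.cos(s), np.sin(s)], axis=1)

def objective(curve, data, p=2, lam=0.1):
    """Discrete proxy for J_p(f) plus a Sobolev-type penalty.

    Fit term: each data point pays its distance^p to the nearest curve
    point (a quantization stand-in for W_p^p(rho, nu)).
    Penalty: squared second differences of the curve points, a
    finite-difference stand-in for a W^{2,2} seminorm of f.
    """
    d2 = ((data[:, None, :] - curve[None, :, :]) ** 2).sum(axis=2)
    fit = (d2.min(axis=1) ** (p / 2)).mean()
    second_diff = curve[2:] - 2 * curve[1:-1] + curve[:-2]
    return fit + lam * (second_diff ** 2).sum()

def barycenter_step(curve, data):
    """Lloyd-style update: move each curve point halfway toward the
    barycenter of the data assigned to it -- a crude analogue of
    descending along the barycenter field (illustrative only)."""
    d2 = ((data[:, None, :] - curve[None, :, :]) ** 2).sum(axis=2)
    assign = d2.argmin(axis=1)
    new = curve.copy()
    for j in range(len(curve)):
        pts = data[assign == j]
        if len(pts):
            new[j] = 0.5 * new[j] + 0.5 * pts.mean(axis=0)
    return new

before = objective(curve, data)
for _ in range(20):
    curve = barycenter_step(curve, data)
after = objective(curve, data)
print(before, after)
```

The update here only targets the fit term; the paper's analysis is precisely about how the Sobolev budget interacts with such descent, which this sketch does not capture.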
Problem

Research questions and friction points this paper is trying to address.

Approximate high-dimensional probability measures with lower-dimensional ones
Optimize approximation using Monge-Kantorovich cost under Sobolev constraints
Study gradient behavior and propose discretization for generative learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Monge-Kantorovich fitting for approximation
Applies Sobolev budgets to constrain complexity
Introduces barycenter field for gradient analysis