Riemannian Neural Optimal Transport

📅 2026-02-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing neural optimal transport methods are largely confined to Euclidean spaces, while manifold OT methods that produce discrete approximations of transport maps provably suffer from the curse of dimensionality: fixed accuracy requires exponentially many parameters in the manifold dimension. This work proposes a continuous neural parameterization framework that directly constructs optimal transport maps on Riemannian manifolds, respecting their geometric structure by construction and avoiding discretization. By employing a geometry-aware neural network architecture, the method models Riemannian optimal transport maps continuously, with approximation complexity growing sub-exponentially in the dimension, thereby significantly mitigating the curse of dimensionality. Experiments on synthetic and real-world datasets demonstrate improved scalability and competitive performance relative to discretization-based baselines.

📝 Abstract
Computational optimal transport (OT) offers a principled framework for generative modeling. Neural OT methods, which use neural networks to learn an OT map (or potential) from data in an amortized way, can be evaluated out of sample after training, but existing approaches are tailored to Euclidean geometry. Extending neural OT to high-dimensional Riemannian manifolds remains an open challenge. In this paper, we prove that any method for OT on manifolds that produces discrete approximations of transport maps necessarily suffers from the curse of dimensionality: achieving a fixed accuracy requires a number of parameters that grows exponentially with the manifold dimension. Motivated by this limitation, we introduce Riemannian Neural OT (RNOT) maps, which are continuous neural-network parameterizations of OT maps on manifolds that avoid discretization and incorporate geometric structure by construction. Under mild regularity assumptions, we prove that RNOT maps approximate Riemannian OT maps with sub-exponential complexity in the dimension. Experiments on synthetic and real datasets demonstrate improved scalability and competitive performance relative to discretization-based baselines.
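The abstract's key idea is a continuous parameterization of transport maps that stays on the manifold by construction. A minimal sketch of one common way to realize this on the unit sphere: predict a tangent vector field with a network and push points along the exponential map. The linear map `W` standing in for a learned network, and the sphere setting itself, are illustrative assumptions here, not the paper's actual RNOT architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a learned network v_theta: a fixed random linear map.
W = rng.normal(scale=0.1, size=(3, 3))

def project_to_tangent(x, v):
    # Remove the radial component so v lies in the tangent space T_x S^2.
    return v - np.dot(v, x) * x

def exp_map(x, v):
    # Riemannian exponential map on the unit sphere S^2.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * (v / norm)

def transport(x):
    # T(x) = exp_x(v_theta(x)): the output lies on S^2 for any network output,
    # so the map respects the geometry by construction, with no discretization.
    v = project_to_tangent(x, W @ x)
    return exp_map(x, v)

x = np.array([0.0, 0.0, 1.0])
y = transport(x)
print(np.linalg.norm(y))  # stays on the unit sphere (norm ≈ 1.0)
```

In a trained model, `W @ x` would be replaced by a neural network fit by minimizing an OT objective; the tangent projection and exponential map are what make the parameterization intrinsically Riemannian.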
Problem

Research questions and friction points this paper is trying to address.

Riemannian manifolds
optimal transport
curse of dimensionality
neural networks
generative modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Riemannian Neural Optimal Transport
curse of dimensionality
continuous transport maps
geometric deep learning
optimal transport on manifolds