Variational Entropic Optimal Transport

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the computational inefficiency of entropy-regularized optimal transport (EOT) over continuous spaces, which stems from the intractability of the log-partition function. The authors propose a variational reformulation that yields, for the first time, an exact variational representation of this partition function, recasting it as a differentiable minimization problem over an auxiliary positive normalizer. The approach removes the need both for restrictive assumptions such as Gaussian-mixture parameterizations and for MCMC sampling, thereby enabling end-to-end training. By combining neural function approximation with stochastic gradient optimization, the method yields an efficient, scalable, differentiable EOT solver. Experiments on synthetic data and unpaired image-to-image translation tasks demonstrate that the proposed framework matches or exceeds the performance of existing methods.
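One exact variational representation consistent with this description is the classical tangent-line identity for the logarithm; the sketch below is a plausible reading, not necessarily the paper's precise formulation. For any potential $f$ with $\mathbb{E}[e^{f(X)}] < \infty$:

```latex
\log \mathbb{E}\big[e^{f(X)}\big]
  \;=\; \min_{c>0}\;\Big\{\, \frac{\mathbb{E}\big[e^{f(X)}\big]}{c} + \log c - 1 \,\Big\},
\qquad \text{with minimizer } c^{*} = \mathbb{E}\big[e^{f(X)}\big].
```

The identity follows from $\log z \le z/c + \log c - 1$ for all $c > 0$ (the tangent of $\log$ at $c$), with equality at $c = z$. Because the expectation enters the reformulated objective linearly, minibatch estimates give unbiased stochastic gradients, which $\log \mathbb{E}[\exp(\cdot)]$ itself does not admit.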

📝 Abstract
Entropic optimal transport (EOT) in continuous spaces with quadratic cost is a classical tool for solving the domain translation problem. In practice, recent approaches optimize a weak dual EOT objective depending on a single potential, but doing so is computationally inefficient due to the intractable log-partition term. Existing methods typically resolve this obstacle in one of two ways: by significantly restricting the transport family to obtain closed-form normalization (via Gaussian-mixture parameterizations), or by using general neural parameterizations that require simulation-based training procedures. We propose Variational Entropic Optimal Transport (VarEOT), based on an exact variational reformulation of the log-partition $\log \mathbb{E}[\exp(\cdot)]$ as a tractable minimization over an auxiliary positive normalizer. This yields a differentiable learning objective optimized with stochastic gradients and avoids the need for MCMC simulation during training. We provide theoretical guarantees, including finite-sample generalization bounds and approximation results under universal function approximation. Experiments on synthetic data and unpaired image-to-image translation demonstrate competitive or improved translation quality, while comparisons among solvers that use the same weak dual EOT objective support the benefit of the proposed optimization principle.
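As a minimal numerical sketch of the optimization principle the abstract describes (not the paper's implementation; the potential `f`, the step size, and the step count below are illustrative assumptions), the tangent-line reformulation of the log-partition can be checked in one dimension by minimizing over the auxiliary normalizer with plain gradient descent:

```python
import numpy as np

# Sketch of a tangent-line reformulation consistent with the abstract:
#   log E[exp(f(X))] = min_{c>0}  E[exp(f(X))] / c + log c - 1.
# Here f(x) = x / 2 is an arbitrary illustrative potential.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)          # samples standing in for the data
exp_f = np.exp(0.5 * x)             # exp(f(x))

direct = np.log(exp_f.mean())       # intractable in general; computable here

# Minimize the variational objective over c, parameterized as c = exp(t)
# so positivity is maintained automatically.
t = 0.0
for _ in range(2_000):
    grad = 1.0 - exp_f.mean() * np.exp(-t)  # d/dt [E[exp(f)] e^{-t} + t - 1]
    t -= 0.1 * grad

c = np.exp(t)
variational = exp_f.mean() / c + np.log(c) - 1.0
# At convergence c ≈ E[exp(f(X))] and the two estimates coincide.
```

In the full method, the same objective would presumably be averaged over minibatches, with the normalizer and the potential given by neural networks, so every term admits unbiased stochastic gradients without MCMC simulation.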
Problem

Research questions and friction points this paper is trying to address.

Entropic Optimal Transport
Domain Translation
Log-partition Function
Computational Efficiency
Continuous Spaces
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variational Entropic Optimal Transport
Log-partition Variational Reformulation
Differentiable Optimization
Stochastic Gradient Training
Unpaired Image-to-Image Translation