🤖 AI Summary
Existing methods for controllably transporting diffusion processes between Gaussian mixture models (GMMs) rely on costly iterative training, particularly Schrödinger Bridge-based approaches.
Method: We propose the first analytical parametric framework that reformulates optimal transport between GMMs as a low-dimensional linear program—whose dimension scales linearly with the number of mixture components—bypassing iterative training entirely. Our approach unifies entropy-regularized optimal transport theory with linear time-varying system modeling to yield an explicit, differentiable, and zero-training solution for entropic optimal transport (EOT) between GMM boundary distributions.
Results: The method achieves significant improvements over state-of-the-art methods on multiple GMM benchmarks. Moreover, it generalizes effectively to low-dimensional generative tasks, including image-to-image translation in autoencoder latent spaces, demonstrating both computational efficiency and broad applicability.
📝 Abstract
Schrödinger Bridges (SB) are diffusion processes that steer, in finite time, a given initial distribution to another final one while minimizing a suitable cost functional. Although various methods for computing SBs have recently been proposed in the literature, most of these approaches require computationally expensive training schemes, even for solving low-dimensional problems. In this work, we propose an analytic parametrization of a set of feasible policies for steering the distribution of a dynamical system from one Gaussian Mixture Model (GMM) to another. Instead of relying on standard non-convex optimization techniques, the optimal policy within the set can be approximated as the solution of a low-dimensional linear program whose dimension scales linearly with the number of components in each mixture. Furthermore, our method generalizes naturally to more general classes of dynamical systems, such as controllable Linear Time-Varying systems, that cannot currently be solved using traditional neural SB approaches. We showcase the potential of this approach in low-to-moderate dimensional problems such as image-to-image translation in the latent space of an autoencoder, among other examples. We also benchmark our approach on an Entropic Optimal Transport (EOT) problem and show that it outperforms state-of-the-art methods in cases where the boundary distributions are mixture models, while requiring virtually no training.
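To make the "low-dimensional linear program" idea concrete, the sketch below solves a generic component-coupling LP between two GMMs: the LP variables form a K1×K2 coupling matrix over mixture components, so the problem size scales with the number of components rather than the ambient dimension. This is an illustrative stand-in, not the paper's exact parametrization; here the pairwise cost is the closed-form squared 2-Wasserstein distance between Gaussian components (`gaussian_w2_sq`), whereas the paper's method would use its own SB/EOT-derived costs.

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.optimize import linprog

def gaussian_w2_sq(m1, S1, m2, S2):
    """Closed-form squared 2-Wasserstein distance between N(m1,S1) and N(m2,S2):
    ||m1-m2||^2 + tr(S1 + S2 - 2*(S2^{1/2} S1 S2^{1/2})^{1/2})."""
    rS2 = sqrtm(S2)
    cross = sqrtm(rS2 @ S1 @ rS2)
    return float(np.sum((m1 - m2) ** 2)
                 + np.trace(S1 + S2 - 2.0 * np.real(cross)))

def mixture_coupling(ws, mus, Sigmas, wt, nus, Lams):
    """Solve the component-level LP:
    min <C, P>  s.t.  P @ 1 = ws,  P.T @ 1 = wt,  P >= 0,
    where C[i, j] is the cost of pairing source component i with target j."""
    K1, K2 = len(ws), len(wt)
    C = np.array([[gaussian_w2_sq(mus[i], Sigmas[i], nus[j], Lams[j])
                   for j in range(K2)] for i in range(K1)])
    # Marginal equality constraints on the flattened coupling matrix.
    A_eq, b_eq = [], []
    for i in range(K1):                      # row sums match source weights
        row = np.zeros((K1, K2)); row[i, :] = 1.0
        A_eq.append(row.ravel()); b_eq.append(ws[i])
    for j in range(K2):                      # column sums match target weights
        col = np.zeros((K1, K2)); col[:, j] = 1.0
        A_eq.append(col.ravel()); b_eq.append(wt[j])
    res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.x.reshape(K1, K2)
```

With the coupling in hand, each pair of components (i, j) with mass P[i, j] can be steered individually (e.g. by a Gaussian-to-Gaussian bridge), which is what keeps the optimization linear in the number of components.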