🤖 AI Summary
This work addresses the challenge of modeling joint optimal transport (OT) between multiple conditional distributions under continuous conditioning variables—particularly when the number of conditions is large and samples per condition are sparse. We propose a conditional generative framework grounded in flow matching and OT theory. Its core innovation is a novel loss function that jointly optimizes transport maps across infinitely many pairs of conditional distributions, with a theoretical guarantee of convergence. We further introduce continuous condition embeddings and jointly parameterized transport maps to enable all-to-all conditional transfer. Extensive experiments on synthetic data, standard benchmarks (MNIST, CIFAR-10), and cheminformatics datasets with property-based conditioning demonstrate that our method significantly improves cross-condition generation fidelity and transfer quality in low-data regimes, outperforming state-of-the-art conditional OT and conditional generative models.
📝 Abstract
In this paper, we propose a flow-based method for learning all-to-all transfer maps among conditional distributions, approximating pairwise optimal transport. The proposed method addresses the challenge of handling continuous conditions, which typically involve a large set of conditions with sparse empirical observations per condition. We introduce a novel cost function that enables simultaneous learning of optimal transports for all pairs of conditional distributions. Our method is supported by a theoretical guarantee that, in the limit, it converges to the pairwise optimal transports among infinitely many pairs of conditional distributions. The learned transport maps are subsequently used to couple data points in conditional flow matching. We demonstrate the effectiveness of this method on synthetic and benchmark datasets, as well as on chemical datasets where continuous physical properties serve as conditions.
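The abstract's final step—using a transport coupling to pair data points for conditional flow matching—can be illustrated with a minimal sketch. Note this is *not* the paper's novel joint cost function: it uses a standard minibatch OT coupling (an exact assignment under squared-Euclidean cost via `scipy.optimize.linear_sum_assignment`) purely to show how coupled pairs yield flow matching regression targets; the function name `ot_coupled_cfm_pairs` is a hypothetical helper for illustration.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ot_coupled_cfm_pairs(x0, x1, rng):
    """Pair two minibatches by an exact OT assignment, then build
    conditional-flow-matching training targets (illustrative sketch)."""
    # Squared-Euclidean cost between every source/target pair.
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
    rows, cols = linear_sum_assignment(cost)  # hard minibatch OT coupling
    x0, x1 = x0[rows], x1[cols]
    # Sample interpolation times and form the straight-line path.
    t = rng.uniform(size=(len(x0), 1))
    xt = (1 - t) * x0 + t * x1       # point on the path at time t
    v_target = x1 - x0               # velocity the flow model regresses onto
    return xt, t, v_target
```

In a full pipeline, a neural velocity field v(x, t, condition) would be trained to regress `v_target` at `(xt, t)`; the paper's contribution replaces the per-minibatch assignment above with a jointly learned transport map valid across all condition pairs.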