🤖 AI Summary
This work addresses the open problem of non-asymptotic convergence of Iterative Markovian Fitting (IMF) for the Schrödinger bridge problem. We establish, for the first time, a rigorous exponential convergence guarantee under mild structural assumptions (such as log-concavity or weak log-concavity) on the reference measure and marginal distributions. To overcome the limitations of classical asymptotic analysis, we introduce a novel contraction framework based on Markovian projection operators. Our analysis proves that IMF converges at an exponential rate, covering core diffusion-based Schrödinger bridge settings and providing the first non-asymptotic convergence guarantee for practical algorithms such as Diffusion Schrödinger Bridge Matching (DSBM). This result fills a critical theoretical gap for IMF, bridging optimal transport, generative modeling, and iterative fitting through a unified non-asymptotic analysis.
📝 Abstract
The Schrödinger Bridge (SB) problem has become a fundamental tool in computational optimal transport and generative modeling. To address it, idealized methods such as Iterative Proportional Fitting and Iterative Markovian Fitting (IMF) have been proposed, alongside practical approximations like the Diffusion Schrödinger Bridge and its Matching variant (DSBM). While previous work has established asymptotic convergence guarantees for IMF, a quantitative, non-asymptotic understanding has remained open. In this paper, we provide the first non-asymptotic exponential convergence guarantees for IMF under mild structural assumptions on the reference measure and marginal distributions, assuming a sufficiently large time horizon. Our results cover two key regimes: one where the marginals are log-concave, and another where they are weakly log-concave. The analysis relies on new contraction results for the Markovian projection operator and paves the way toward theoretical guarantees for DSBM.
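The convergence mechanism described above rests on a contraction property: once the projection operator is shown to be a contraction, exponential decay of the error follows from standard fixed-point iteration. The toy sketch below (not the paper's algorithm; `rho`, `fixed_point`, and the map `T` are hypothetical stand-ins for the contraction factor, the Schrödinger bridge, and the Markovian projection) illustrates why iterating any `rho`-contraction yields an error that shrinks geometrically, i.e. exponentially in the iteration count.

```python
# Toy illustration of exponential convergence via contraction.
# Assumed/hypothetical quantities, not from the paper:
rho = 0.5          # contraction factor of the projection operator
fixed_point = 2.0  # stand-in for the Schrödinger bridge (the fixed point)

def T(x):
    # A rho-contraction with fixed point `fixed_point`:
    # |T(x) - T(y)| = rho * |x - y| for all x, y.
    return fixed_point + rho * (x - fixed_point)

x = 10.0   # arbitrary initialization
errors = []
for k in range(20):
    errors.append(abs(x - fixed_point))
    x = T(x)

# After k iterations the error is exactly rho**k times the initial error,
# i.e. it decays exponentially in k.
for k in (0, 5, 10):
    assert abs(errors[k] - rho**k * errors[0]) < 1e-9
```

In the paper's setting the iterates are path measures rather than scalars and the contraction is established in an appropriate divergence, but the resulting geometric error decay follows the same template.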