🤖 AI Summary
This work addresses the slow convergence and training instability of Schrödinger Bridge (SB) models. We propose a paradigm that integrates pre-trained diffusion models into the SB solving framework. Leveraging three unified reparameterization techniques (Iterative Proportional Mean-Matching, IPMM; Iterative Proportional Terminus-Matching, IPTM; and Iterative Proportional Flow-Matching, IPFM), we achieve, for the first time, transferable initialization of the SB's potential functions from pre-trained score models. This strategy significantly improves SB training stability and accelerates convergence, reducing the required iterations by 30%–50%. Remarkably, it also enhances the generative quality of the underlying diffusion models, lowering FID scores by 12%–22%. The core contribution lies in establishing a bidirectional synergy between diffusion priors and SB dynamics, offering a principled approach to model initialization and knowledge transfer in generative modeling.
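For context, the link between SB potentials and diffusion scores can be stated with the standard forward/backward SDE formulation from the SB literature (shown here as assumed background, not necessarily the exact parameterization used in this paper):

```latex
% Schrödinger Bridge as a pair of coupled SDEs with potentials \Psi, \hat{\Psi}
% (standard SB formulation, assumed background; notation may differ from the paper)
\begin{aligned}
\mathrm{d}X_t &= \bigl[f(X_t,t) + g(t)^2 \nabla \log \Psi(X_t,t)\bigr]\,\mathrm{d}t + g(t)\,\mathrm{d}W_t,
  && X_0 \sim p_{\mathrm{data}},\\
\mathrm{d}X_t &= \bigl[f(X_t,t) - g(t)^2 \nabla \log \hat{\Psi}(X_t,t)\bigr]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{W}_t,
  && X_T \sim p_{\mathrm{prior}},\\
\nabla \log p_t(x) &= \nabla \log \Psi(x,t) + \nabla \log \hat{\Psi}(x,t).
\end{aligned}
```

The last identity is what makes a pre-trained score network a natural initializer: if $\nabla \log \Psi \equiv 0$, the backward potential coincides with the diffusion score and the SB reduces to an ordinary SGM, so a pre-trained SGM sits exactly at this degenerate starting point of the bridge.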
📝 Abstract
This paper aims to unify Score-based Generative Models (SGMs), also known as diffusion models, and the Schrödinger Bridge (SB) problem through three reparameterization techniques: Iterative Proportional Mean-Matching (IPMM), Iterative Proportional Terminus-Matching (IPTM), and Iterative Proportional Flow-Matching (IPFM). These techniques significantly accelerate and stabilize the training of SB-based models. Furthermore, the paper introduces novel initialization strategies that use pre-trained SGMs to train SB-based models effectively. By using SGMs as initialization, we leverage the advantages of both families of models: SB-based models are trained efficiently, and the performance of the underlying SGMs is further improved. Extensive experiments demonstrate the effectiveness of the proposed methods and the significant improvements they bring. We believe this work contributes to, and paves the way for, future research on generative models.
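To make the initialization idea concrete, here is a minimal PyTorch-style sketch of one plausible warm-start scheme: copy a pre-trained score network into the backward SB potential and zero the forward potential's score head, so training begins at the SGM special case of the bridge. All names here (`ScoreNet`, `init_sb_from_sgm`) are illustrative assumptions, not the paper's actual code or exact procedure.

```python
import copy
import torch
import torch.nn as nn

class ScoreNet(nn.Module):
    """Toy stand-in for a pre-trained score model s_theta(x, t) ~ grad log p_t(x)."""
    def __init__(self, dim: int = 2, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Condition on time by concatenating t as an extra input feature.
        return self.net(torch.cat([x, t[:, None]], dim=-1))


def init_sb_from_sgm(pretrained_score: ScoreNet):
    """Hypothetical SB warm start from a pre-trained SGM (illustrative sketch).

    Backward potential grad log Psi_hat <- copy of the pre-trained score network;
    forward potential grad log Psi      <- fresh network with a zeroed output head,
    i.e. the SGM special case of the Schrodinger Bridge as the starting point.
    """
    backward_net = copy.deepcopy(pretrained_score)   # warm start from the SGM
    forward_net = ScoreNet()                         # same architecture, fresh weights
    last = forward_net.net[-1]
    nn.init.zeros_(last.weight)                      # zero the output layer so that
    nn.init.zeros_(last.bias)                        # grad log Psi is ~0 at initialization
    return forward_net, backward_net


if __name__ == "__main__":
    score_model = ScoreNet()                         # pretend this was pre-trained
    fwd, bwd = init_sb_from_sgm(score_model)
    x, t = torch.randn(8, 2), torch.rand(8)
    print(fwd(x, t).abs().max().item())              # ~0: forward potential starts flat
    print(torch.allclose(bwd(x, t), score_model(x, t)))  # True: backward matches the SGM
```

Under this kind of initialization, the first SB refinement iterations only need to correct the bridge away from the SGM solution rather than learn both potentials from scratch, which is consistent with the faster and more stable convergence the paper reports.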