🤖 AI Summary
This work addresses the challenge of developing a modality-agnostic, unified generative modeling framework that generalizes and unifies Markovian generative approaches. We propose Generator Matching, a principled framework grounded in arbitrary Markov processes (including continuous diffusion, flow, discrete transition, and jump processes), which models data distributions by rigorously aligning conditional and marginal generators. Our contributions are threefold: (i) the first unified treatment of diffusion models, flow matching, and discrete diffusion under a single theoretical umbrella; (ii) the first systematic extension of generative modeling to non-standard jump processes; and (iii) support for rigorous superposition of Markov generators and joint multimodal modeling. Experiments on image and multimodal generation show that superposition with a jump process delivers significant empirical improvements.
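To make "aligning conditional and marginal generators" concrete, the following is a brief sketch in our own notation (which may differ from the paper's): the generator describes the infinitesimal evolution of a Markov process through its action on test functions, and the marginal generator can be expressed as a posterior expectation over conditional generators, each of which transports mass toward a single data point $z$.

```latex
% Generator of a Markov process (X_t), acting on a test function f:
(\mathcal{L}_t f)(x) \;=\; \lim_{h \to 0^+}
  \frac{\mathbb{E}\!\left[ f(X_{t+h}) \mid X_t = x \right] - f(x)}{h}.

% Conditional-to-marginal alignment (sketch): if \mathcal{L}_t^z generates
% the conditional path p_t(\cdot \mid z) for a single data point z, the
% marginal generator of p_t is recovered as a posterior expectation,
(\mathcal{L}_t f)(x) \;=\;
  \mathbb{E}_{z \sim p_t(z \mid x)}\!\left[ (\mathcal{L}_t^z f)(x) \right],
% which is what a learned model is trained to approximate.
```

This mirrors the conditional/marginal construction of flow matching, with the generator replacing the vector field so that diffusion, flow, discrete, and jump dynamics are all covered by one object.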
📝 Abstract
We introduce Generator Matching, a modality-agnostic framework for generative modeling using arbitrary Markov processes. Generators characterize the infinitesimal evolution of a Markov process, which we leverage for generative modeling in a similar vein to flow matching: we construct conditional generators that generate single data points, then learn to approximate the marginal generator that generates the full data distribution. We show that Generator Matching unifies various generative modeling methods, including diffusion models, flow matching, and discrete diffusion models. Furthermore, it expands the design space to new and unexplored Markov processes such as jump processes. Finally, Generator Matching enables the construction of superpositions of Markov generative models, and of multimodal models, in a rigorous manner. We empirically validate our method on image and multimodal generation, showing, for example, that superposition with a jump process improves performance.
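The superposition property mentioned above can be stated as follows (a sketch in our own notation, not necessarily the paper's): because generators are linear objects, two generators that produce the same marginal probability path can be convexly combined without leaving that path, which is what allows, e.g., a flow to be mixed with a jump process.

```latex
% Superposition (sketch): suppose \mathcal{L}_t^a and \mathcal{L}_t^b each
% generate the same probability path p_t. Then for any \alpha \in [0, 1],
% the convex combination
\mathcal{L}_t^{\mathrm{sup}}
  \;=\; \alpha\, \mathcal{L}_t^a \;+\; (1 - \alpha)\, \mathcal{L}_t^b
% is again a valid Markov generator and also generates p_t,
% by linearity of the Kolmogorov forward equation in the generator.
```

In this light, "superposition with a jump process" means adding a weighted jump-process generator on top of, say, a flow generator for the same path, with the weight acting as a sampling-time design choice.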