🤖 AI Summary
Existing denoising Markov generative models lack a universal, rigorous theoretical framework.
Method: We establish a unified mathematical foundation grounded in nonequilibrium statistical mechanics and the generalized Doob $h$-transform, rigorously characterizing the duality between arbitrary Lévy-type forward processes and their time-reversed backward generators.
Contribution/Results: We introduce a minimal complete set of assumptions enabling explicit construction of backward generators, unified variational objective design, and a universal generalization of score matching that subsumes and extends both continuous and discrete diffusion models. Instantiating the framework with geometric Brownian motion and jump processes as forward dynamics, we formulate and empirically validate two novel denoising models. These exhibit enhanced flexibility and practicality in modeling complex distributions, advancing the theoretical and methodological scope of generative modeling.
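As a concrete (and necessarily simplified) illustration of why geometric Brownian motion is a tractable forward dynamic: its one-step transition is exactly log-normal, so the conditional score used in denoising-style training is available in closed form. The sketch below is a toy check of that closed-form score against a finite difference of the transition log-density; the parameter values are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

# Geometric Brownian motion dX = mu*X dt + sqrt(s2)*X dW has the exact transition
# X_t | X_0 = x0  ~  LogNormal(m, v),  m = log(x0) + (mu - s2/2)*t,  v = s2*t.
mu, s2, t, x0 = 0.1, 0.5, 1.0, 2.0   # illustrative parameters
m = np.log(x0) + (mu - 0.5 * s2) * t  # log-mean of the transition
v = s2 * t                            # log-variance of the transition

def log_p(x):
    # log density of LogNormal(m, v) evaluated at x > 0
    return -np.log(x) - 0.5 * np.log(2 * np.pi * v) - (np.log(x) - m) ** 2 / (2 * v)

def score(x):
    # closed-form conditional score: d/dx log p(x | x0)
    return -(1.0 + (np.log(x) - m) / v) / x

# Numerical sanity check: central finite difference of log_p matches score.
x, h = 1.7, 1e-6
fd = (log_p(x + h) - log_p(x - h)) / (2 * h)
print(fd, score(x))
```

Having an explicit conditional score like this is what makes a score-matching-style regression target computable for non-Gaussian forward dynamics.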
📝 Abstract
Probabilistic generative models based on measure transport, such as diffusion and flow-based models, are often formulated in the language of Markovian stochastic dynamics, where the choice of the underlying process impacts both algorithmic design choices and theoretical analysis. In this paper, we aim to establish a rigorous mathematical foundation for denoising Markov models, a broad class of generative models that postulate a forward process transitioning from the target distribution to a simple, easy-to-sample distribution, alongside a backward process constructed specifically to enable efficient sampling in the reverse direction. Leveraging deep connections with nonequilibrium statistical mechanics and the generalized Doob $h$-transform, we propose a minimal set of assumptions that ensure: (1) explicit construction of the backward generator, (2) a unified variational objective directly minimizing the measure transport discrepancy, and (3) adaptations of the classical score-matching approach across diverse dynamics. Our framework unifies existing formulations of continuous and discrete diffusion models, identifies the most general form of denoising Markov models under certain regularity assumptions on forward generators, and provides a systematic recipe for designing denoising Markov models driven by arbitrary Lévy-type processes. We illustrate the versatility and practical effectiveness of our approach through novel denoising Markov models employing geometric Brownian motion and jump processes as forward dynamics, highlighting the framework's potential flexibility and capability in modeling complex distributions.
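To make the classical score-matching adaptation mentioned in point (3) concrete, here is a minimal toy sketch (not the paper's general construction) for the simplest forward dynamic, an Ornstein-Uhlenbeck / variance-preserving process on standard-Gaussian data. In that case the marginal score is exactly $-x$, so fitting a linear score model $s(x) = a\,x$ by denoising score matching should recover $a \approx -1$; the sample size and time value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
t = 0.5
alpha = np.exp(-t)                  # OU mean decay e^{-t}
sigma = np.sqrt(1.0 - alpha ** 2)   # variance-preserving noise scale

# Data distribution: standard Gaussian, so the noised marginal is also N(0, 1)
# and the true marginal score is exactly -x.
n = 200_000
x0 = rng.standard_normal(n)
eps = rng.standard_normal(n)
xt = alpha * x0 + sigma * eps       # sample from the forward (noising) transition

# Denoising score-matching regression target: grad_x log p(x_t | x_0) = -eps / sigma.
target = -eps / sigma

# Fit the linear score model s(x) = a*x by least squares (closed form).
a = np.sum(xt * target) / np.sum(xt * xt)
print(a)  # close to -1, the true marginal score slope
```

The same recipe (regress a model onto the conditional score of the forward transition) is what generalizing score matching beyond Brownian dynamics has to reproduce for other Lévy-type processes.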