🤖 AI Summary
Existing generative models lack probabilistic semantics in their sampling procedures. Method: We formulate sample generation as Bayesian posterior inference: starting from a broad prior, high-quality samples are obtained by alternating prediction and Bayesian update steps that progressively refine the posterior distribution. We employ Gaussian posterior approximations, a sequential prediction–update mechanism, and variational inference. Contribution/Results: This work systematically recasts generative modeling as Bayesian inference over latent sample variables, unifying diffusion models and Bayesian Flow Networks (BFNs); BFNs are proven to be a special case of the framework. Experiments on CIFAR-10 and ImageNet demonstrate improved log-likelihoods over BFNs and variational diffusion models, achieving competitive likelihood scores.
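The sequential prediction–update mechanism described above can be sketched as a toy conjugate-Gaussian loop. This is a minimal illustration only: the predictor, the precision schedule `alphas`, and the dimensionality are illustrative placeholders, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = rng.normal(size=4)      # the unknown sample being inferred (toy stand-in)

mu, rho = np.zeros(4), 1.0       # broad initial belief N(mu, 1/rho)
alphas = [0.5, 1.0, 2.0, 4.0]    # per-step observation precisions (assumed schedule)

for alpha in alphas:
    # Prediction step: placeholder for the learned network's estimate of the sample.
    x_hat = x_true
    # Noisy pseudo-observation of the prediction with precision alpha.
    y = x_hat + rng.normal(size=4) / np.sqrt(alpha)
    # Bayesian update: conjugate Gaussian posterior (precision-weighted average).
    mu = (rho * mu + alpha * y) / (rho + alpha)
    rho = rho + alpha            # precision accumulates, so the belief narrows
```

Each pass through the loop tightens the posterior: the variance `1/rho` shrinks monotonically while the mean `mu` is pulled toward the (noisily observed) prediction, mirroring how the model narrows down the sample from a broad initial belief.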
📝 Abstract
We derive a novel generative model from the simple act of Gaussian posterior inference. Treating the generated sample as an unknown variable to infer lets us formulate the sampling process in the language of Bayesian probability. Our model uses a sequence of prediction and posterior update steps to narrow down the unknown sample from a broad initial belief. In addition to a rigorous theoretical analysis, we establish a connection between our model and diffusion models and show that it includes Bayesian Flow Networks (BFNs) as a special case. In our experiments, we demonstrate improved performance over both BFNs and Variational Diffusion Models, achieving competitive likelihood scores on CIFAR-10 and ImageNet.