Denoising Diffusion Variational Inference: Diffusion Models as Expressive Variational Posteriors

📅 2024-01-05
📈 Citations: 1
Influential: 0
🤖 AI Summary
This paper addresses the challenge of posterior inference in deep latent variable models—particularly in biological applications such as human genomic ancestry inference—by proposing Diffusion-based Deep Variational Inference (DDVI), a black-box variational inference framework grounded in diffusion processes. DDVI directly models a learnable diffusion process as an iterative refinement of the variational posterior in latent space, marking the first use of diffusion models as highly flexible and expressive posterior approximators. To balance inference fidelity and training stability, DDVI introduces a wake-sleep-inspired regularized ELBO objective. Evaluated on standard benchmarks and the 1000 Genomes Project dataset for ancestry inference, DDVI significantly outperforms strong baselines—including normalizing flows and adversarial posteriors—in both inference accuracy and model learning efficacy.

📝 Abstract
We propose denoising diffusion variational inference (DDVI), a black-box variational inference algorithm for latent variable models which relies on diffusion models as flexible approximate posteriors. Specifically, our method introduces an expressive class of diffusion-based variational posteriors that perform iterative refinement in latent space; we train these posteriors with a novel regularized evidence lower bound (ELBO) on the marginal likelihood inspired by the wake-sleep algorithm. Our method is easy to implement (it fits a regularized extension of the ELBO), is compatible with black-box variational inference, and outperforms alternative classes of approximate posteriors based on normalizing flows or adversarial networks. We find that DDVI improves inference and learning in deep latent variable models across common benchmarks as well as on a motivating task in biology -- inferring latent ancestry from human genomes -- where it outperforms strong baselines on the Thousand Genomes dataset.
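The abstract's two key ingredients -- an iterative-refinement posterior in latent space and a wake-sleep-inspired regularized ELBO -- can be sketched in a toy one-dimensional form. Everything below (the linear "denoiser" update, the step sizes, the Gaussian entropy estimate, the sleep-phase penalty and its weight) is a hypothetical stand-in for DDVI's learned networks and objective, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def refine_posterior_sample(x, n_steps=10, step=0.3, noise=0.05):
    """Draw z ~ q(z|x) by iterative refinement in latent space:
    start from a prior sample and repeatedly denoise it toward a
    data-dependent target (a stand-in for a learned denoising net)."""
    z = rng.normal()  # z_T ~ N(0, 1)
    for _ in range(n_steps):
        # hypothetical denoiser: pull z toward x, plus a little noise
        z = z + step * (x - z) + noise * rng.normal()
    return z

def log_normal(v, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (v - mean) ** 2 / var)

def regularized_elbo(x, n_samples=500, reg_weight=0.1):
    """Monte-Carlo ELBO for the toy model p(z) = N(0, 1),
    p(x|z) = N(z, 1), with the entropy of the implicit posterior
    crudely approximated as Gaussian from sample variance, minus a
    wake-sleep-style "sleep" penalty: the refinement process should
    recover the latents that actually generated simulated data."""
    zs = np.array([refine_posterior_sample(x) for _ in range(n_samples)])
    joint = log_normal(x, zs, 1.0) + log_normal(zs, 0.0, 1.0)
    entropy = 0.5 * np.log(2 * np.pi * np.e * zs.var())  # Gaussian approx.
    elbo = joint.mean() + entropy
    # sleep phase: z* ~ p(z), x* ~ p(x|z*); penalize failing to recover z*
    z_true = rng.normal(size=n_samples)
    x_sim = z_true + rng.normal(size=n_samples)
    z_rec = np.array([refine_posterior_sample(xi) for xi in x_sim])
    sleep_penalty = np.mean((z_rec - z_true) ** 2)
    return elbo - reg_weight * sleep_penalty

objective = regularized_elbo(x=2.0)
```

In this sketch the refinement chain contracts toward the observation, so posterior samples concentrate near `x`; the sleep penalty plays the role of the paper's regularizer, trading some ELBO tightness for training stability.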
Problem

Research questions and friction points this paper is trying to address.

Develops diffusion-based variational posteriors for latent variable models
Introduces regularized ELBO for training expressive approximate posteriors
Improves inference in deep latent variable models and biological tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses diffusion models as variational posteriors
Trains with a regularized ELBO inspired by the wake-sleep algorithm
Outperforms posteriors based on normalizing flows and adversarial networks