Feynman-Kac Correctors in Diffusion: Annealing, Guidance, and Product of Experts

📅 2025-03-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of accurately sampling from multi-objective target distributions (such as annealed, geometric-averaged, or product-of-experts distributions) at inference time with pretrained diffusion models. We propose a weighted simulation framework grounded in the Feynman–Kac formula, which links solutions of the relevant partial differential equations to expectations over weighted stochastic trajectories, yielding a theoretically principled correction mechanism for diffusion sampling. We design scalable Sequential Monte Carlo (SMC) resampling algorithms that support inference-time temperature annealing and the combination of multiple pretrained models. The method requires no additional training, is plug-and-play, and unifies score guidance, expert ensemble fusion, and inference-time scaling within a single framework. Experiments demonstrate substantial improvements in multi-objective molecular generation and in classifier-free guidance for text-to-image synthesis.

📝 Abstract
While score-based generative models are the model of choice across diverse domains, there are limited tools available for controlling inference-time behavior in a principled manner, e.g. for composing multiple pretrained models. Existing classifier-free guidance methods use a simple heuristic to mix conditional and unconditional scores to approximately sample from conditional distributions. However, such methods do not approximate the intermediate distributions, necessitating additional 'corrector' steps. In this work, we provide an efficient and principled method for sampling from a sequence of annealed, geometric-averaged, or product distributions derived from pretrained score-based models. We derive a weighted simulation scheme which we call Feynman-Kac Correctors (FKCs) based on the celebrated Feynman-Kac formula by carefully accounting for terms in the appropriate partial differential equations (PDEs). To simulate these PDEs, we propose Sequential Monte Carlo (SMC) resampling algorithms that leverage inference-time scaling to improve sampling quality. We empirically demonstrate the utility of our methods by proposing amortized sampling via inference-time temperature annealing, improving multi-objective molecule generation using pretrained models, and improving classifier-free guidance for text-to-image generation. Our code is available at https://github.com/martaskrt/fkc-diffusion.
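The score-mixing heuristic the abstract criticizes can be sketched in a few lines (a toy illustration with Gaussian scores; the function name and setup are mine, not the paper's code):

```python
import numpy as np

def cfg_score(score_cond, score_uncond, w):
    # Heuristic classifier-free guidance: linearly mix the conditional and
    # unconditional scores. This corresponds to the unnormalized density
    # p_uncond(x)^(1-w) * p_cond(x)^w, which does not match the intermediate
    # distributions of the diffusion process -- the gap FKCs correct for.
    return (1.0 - w) * score_uncond + w * score_cond

# Toy 2-D example: the score of N(mu, I) at x is mu - x.
x = np.zeros(2)
s_uncond = np.array([0.0, 0.0]) - x   # unconditional mean at the origin
s_cond = np.array([1.0, 1.0]) - x     # conditional mean at (1, 1)
print(cfg_score(s_cond, s_uncond, w=2.0))  # → [2. 2.], extrapolating past the conditional mean
```

With a guidance scale above 1, the mixed score points beyond the conditional mean, which is exactly the kind of intermediate-distribution mismatch the paper's corrector terms account for.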
Problem

Research questions and friction points this paper is trying to address.

Control the inference-time behavior of score-based generative models in a principled way, e.g. when composing multiple pretrained models.
Classifier-free guidance mixes conditional and unconditional scores heuristically and does not approximate the intermediate distributions, necessitating extra corrector steps.
Sample from annealed, geometric-averaged, or product distributions derived from pretrained score models without additional training.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Feynman–Kac Correctors (FKCs): a weighted simulation scheme for principled controlled sampling
Sequential Monte Carlo resampling algorithms that leverage inference-time scaling
Amortized sampling via inference-time temperature annealing
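A minimal sketch of the SMC-with-reweighting idea behind these contributions, applied to inference-time temperature annealing on a 1-D Gaussian (the toy target, schedule, and weight increments are my own illustrative choices, not the paper's FKC weights, which are derived from the diffusion PDEs):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x, beta):
    # Annealed density pi_beta(x) ∝ exp(-beta * x**2 / 2), i.e. N(0, 1/beta).
    return -0.5 * beta * x**2

def systematic_resample(x, w, rng):
    # Systematic resampling: low-variance selection proportional to weights.
    n = len(x)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return x[idx]

betas = np.linspace(0.1, 1.0, 10)   # anneal from a broad proposal to the target
n = 5000
x = rng.normal(0.0, 1.0 / np.sqrt(betas[0]), size=n)  # exact sample at beta_0
logw = np.zeros(n)
for b_prev, b_next in zip(betas[:-1], betas[1:]):
    # Feynman-Kac-style reweighting: accumulate the log density ratio
    # between successive annealed targets.
    logw += log_target(x, b_next) - log_target(x, b_prev)
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w**2) < n / 2:  # resample when effective sample size drops
        x = systematic_resample(x, w, rng)
        logw = np.zeros(n)
    # (a full SMC sampler would also run a mutation/MCMC step here)

w = np.exp(logw - logw.max()); w /= w.sum()
print(np.sum(w * x**2))  # ≈ 1, the second moment of the final target N(0, 1)
```

In the paper's setting the particles are diffusion trajectories and the weight increments come from the corrector terms of the appropriate PDE, but the simulate-reweight-resample loop has this same shape.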