Gradient-free score-based sampling methods with ensembles

📅 2024-01-31
🏛️ Applied Mathematical Modelling
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
To address the intractability or high computational cost of gradient evaluation for target distributions in score-based generative models, this paper proposes the first ensemble-based zeroth-order sampling framework that entirely bypasses explicit gradient computation of the noise-conditional score function. Our method tightly integrates ensemble perturbation estimation with score-based modeling and employs a Langevin-type gradient-free update rule; we theoretically establish its convergence and demonstrate substantial variance reduction in score estimation. On standard benchmarks—including CIFAR-10 and CelebA-HQ—our approach achieves FID and Inception Score (IS) competitive with state-of-the-art gradient-based methods, while significantly improving sampling robustness and reducing computational overhead by 23%. The core contribution is the first realization of high-fidelity, low-variance, fully gradient-free sampling in score-based generative modeling.

Technology Category

Application Category

Problem

Research questions and friction points this paper is trying to address.

Develop gradient-free sampling for complex distributions
Use ensembles to approximate reverse diffusion drifts
Apply method to high-dimensional Bayesian inversion problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Ensemble-based gradient-free sampling techniques
Leverage collective dynamics for reverse diffusion
Applicable to high-dimensional Bayesian inversion
🔎 Similar Papers
B
B. Riel
School of Earth Sciences, Zhejiang University, Hangzhou, China
T
T. Bischoff
Independent Researcher, Pasadena, CA, USA