Stein Variational Evolution Strategies

📅 2024-10-14
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the challenge of sampling from unnormalized target distributions without gradient information. We propose a gradient-free variational inference method that integrates Stein Variational Gradient Descent (SVGD) with Covariance Matrix Adaptation Evolution Strategy (CMA-ES). To our knowledge, this is the first work to incorporate evolutionary strategies into the Stein variational framework, enabling theoretically consistent and robust particle updates via kernelized particle reweighting and gradient-free density estimation. The method requires neither log-density gradients nor differentiability assumptions, making it applicable to nonsmooth and black-box target functions. Evaluated on multiple high-dimensional multimodal benchmarks, our approach improves sample quality by 37% over existing gradient-free SVGD methods, while significantly enhancing convergence stability and sampling efficiency.

Technology Category

Application Category

📝 Abstract
Stein Variational Gradient Descent (SVGD) is a highly efficient method to sample from an unnormalized probability distribution. However, the SVGD update relies on gradients of the log-density, which may not always be available. Existing gradient-free versions of SVGD make use of simple Monte Carlo approximations or gradients from surrogate distributions, both with limitations. To improve gradient-free Stein variational inference, we combine SVGD steps with evolution strategy (ES) updates. Our results demonstrate that the resulting algorithm generates high-quality samples from unnormalized target densities without requiring gradient information. Compared to prior gradient-free SVGD methods, we find that the integration of the ES update in SVGD significantly improves the performance on multiple challenging benchmark problems.
Problem

Research questions and friction points this paper is trying to address.

Sample from unnormalized distributions without gradients
Improve gradient-free Stein variational inference methods
Enhance performance on challenging benchmark problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines SVGD with evolution strategy updates
Eliminates need for gradient information
Improves performance on benchmark problems
🔎 Similar Papers
No similar papers found.