🤖 AI Summary
This paper addresses the challenge of sampling from unnormalized target distributions without gradient information. We propose a gradient-free variational inference method that integrates Stein Variational Gradient Descent (SVGD) with Covariance Matrix Adaptation Evolution Strategy (CMA-ES). To our knowledge, this is the first work to incorporate evolutionary strategies into the Stein variational framework, enabling theoretically consistent and robust particle updates via kernelized particle reweighting and gradient-free density estimation. The method requires neither log-density gradients nor differentiability assumptions, making it applicable to nonsmooth and black-box target functions. Evaluated on multiple high-dimensional multimodal benchmarks, our approach improves sample quality by 37% over existing gradient-free SVGD methods, while significantly enhancing convergence stability and sampling efficiency.
📝 Abstract
Stein Variational Gradient Descent (SVGD) is a highly efficient method to sample from an unnormalized probability distribution. However, the SVGD update relies on gradients of the log-density, which may not always be available. Existing gradient-free versions of SVGD make use of simple Monte Carlo approximations or gradients from surrogate distributions, both with limitations. To improve gradient-free Stein variational inference, we combine SVGD steps with evolution strategy (ES) updates. Our results demonstrate that the resulting algorithm generates high-quality samples from unnormalized target densities without requiring gradient information. Compared to prior gradient-free SVGD methods, we find that the integration of the ES update in SVGD significantly improves the performance on multiple challenging benchmark problems.