🤖 AI Summary
This work addresses the constant-order bias that Stein Variational Gradient Descent (SVGD) exhibits in finite-particle settings, which prevents accurate approximation of the Wasserstein gradient flow. The authors analyze Resolvent-regularized SVGD (R-SVGD), introduced by He et al. (2024), which applies a resolvent-type regularized preconditioner to the kernelized gradient, in both continuous- and discrete-time dynamics. They establish, for the first time, explicit non-asymptotic convergence bounds for a finite number of particles. Under a W₁I condition on the target distribution and for a broad class of smooth kernels, they prove convergence of the time-averaged empirical measure in Wasserstein-1 distance and precisely quantify the trade-off between approximation error and particle estimation error. The analysis further yields practical guidelines for tuning the regularization parameter, step size, and averaging window.
📝 Abstract
We derive finite-particle rates for the regularized Stein variational gradient descent (R-SVGD) algorithm introduced by He et al. (2024), which corrects the constant-order bias of SVGD by applying a resolvent-type preconditioner to the kernelized Wasserstein gradient. For the resulting interacting $N$-particle system, we establish explicit non-asymptotic bounds for time-averaged (annealed) empirical measures, showing convergence in the \emph{true} (non-kernelized) Fisher information and, under a $\mathrm{W}_1\mathrm{I}$ condition on the target, corresponding $\mathrm{W}_1$ convergence for a large class of smooth kernels. Our analysis covers both continuous- and discrete-time dynamics and yields principled tuning rules for the regularization parameter, step size, and averaging horizon that quantify the trade-off between approximating the Wasserstein gradient flow and controlling finite-particle estimation error.
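To make the preconditioning idea concrete, here is a minimal numerical sketch of one finite-particle update: the standard kernelized (SVGD) direction is computed first, and a resolvent-type regularization is then applied before moving the particles. The specific preconditioner form `(K/N + nu*I)^{-1}`, the RBF kernel, and all parameter values are illustrative assumptions, not the exact construction of He et al. (2024).

```python
import numpy as np

def r_svgd_step(X, grad_log_p, nu=0.5, step=0.05, h=1.0):
    """One finite-particle update in the spirit of R-SVGD: compute the
    standard SVGD direction, then apply a resolvent-type preconditioner
    (K/N + nu*I)^{-1} before moving the particles.
    The preconditioner form here is an illustrative assumption."""
    N, _ = X.shape
    diffs = X[:, None, :] - X[None, :, :]             # (N, N, d), x_i - x_j
    K = np.exp(-np.sum(diffs**2, -1) / (2 * h**2))    # RBF kernel matrix
    # Kernelized Wasserstein gradient (standard SVGD direction):
    # phi_i = (1/N) sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]
    scores = grad_log_p(X)                            # (N, d)
    repulsion = np.sum(diffs / h**2 * K[:, :, None], axis=1)
    phi = (K @ scores + repulsion) / N
    # Resolvent-type regularization: solve (K/N + nu*I) v = phi
    v = np.linalg.solve(K / N + nu * np.eye(N), phi)
    return X + step * v
```

For example, iterating this step with the score of a standard Gaussian target, `grad_log_p = lambda x: -x`, drives an offset particle cloud toward the origin; the regularization parameter `nu` controls how strongly the kernel operator is damped before inversion, mirroring the approximation-versus-estimation trade-off discussed above.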