🤖 AI Summary
This work addresses efficient uniform sampling from high-dimensional convex bodies, providing strong convergence guarantees in Rényi divergence, which in turn implies guarantees in total variation (TV), Wasserstein-2 ($\mathcal{W}_2$), Kullback–Leibler (KL), and $\chi^2$ divergence. We propose a novel stochastic walk that analyzes sampling through the lens of stochastic diffusion (a first in this context) and characterizes convergence rates via functional isoperimetric constants of the target distribution, thereby departing from conventional mixing-time analysis. Our algorithm achieves time complexity $O^*(n^2 R^2)$, where $n$ denotes the dimension and $R$ the diameter of the body. Moreover, we derive unified, tight convergence bounds across the entire Rényi divergence family. To our knowledge, this is the first uniform sampling scheme attaining such guarantees simultaneously under multiple probability metrics.
📝 Abstract
We present a new random walk for uniformly sampling high-dimensional convex bodies. It achieves state-of-the-art runtime complexity with stronger guarantees on the output than previously known, namely in Rényi divergence (which implies TV, $\mathcal{W}_2$, KL, and $\chi^2$). The proof departs from known approaches for polynomial-time algorithms for this problem: we utilize a stochastic diffusion perspective to show contraction to the target distribution, with the rate of convergence determined by functional isoperimetric constants of the target distribution.
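To make the diffusion perspective concrete, here is a minimal illustrative sketch of one family of diffusion-style walks for a convex body $K$ given by a membership oracle: each iteration takes a forward Gaussian (heat-flow) step, then resamples from a Gaussian centered at the intermediate point restricted to $K$ via rejection. This is an assumption-laden toy, not the paper's actual algorithm; the step size `h`, the `membership_oracle` interface, and the rejection loop are all choices made here purely for illustration.

```python
import numpy as np

def diffusion_step(x, h, membership_oracle, rng, max_tries=10_000):
    """One illustrative diffusion-style step for sampling a convex body K.

    Forward step: y = x + sqrt(h) * z with z ~ N(0, I) (heat flow).
    Backward step: draw x' ~ N(y, h*I) restricted to K via rejection.
    """
    n = x.shape[0]
    y = x + np.sqrt(h) * rng.standard_normal(n)       # forward Gaussian step
    for _ in range(max_tries):                        # rejection sampling from N(y, hI) on K
        candidate = y + np.sqrt(h) * rng.standard_normal(n)
        if membership_oracle(candidate):
            return candidate
    return x  # fallback: stay put if rejection keeps failing (h too large)

def run_walk(membership_oracle, x0, h, num_steps, seed=0):
    """Iterate the diffusion-style step from a feasible start x0 in K."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(num_steps):
        x = diffusion_step(x, h, membership_oracle, rng)
    return x
```

For example, with `K` the unit Euclidean ball, `membership_oracle = lambda p: np.linalg.norm(p) <= 1.0` and a small step size keep every iterate inside the body; the convergence rate of such walks is what the isoperimetric analysis in the paper quantifies.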