🤖 AI Summary
This work addresses efficient sampling from strongly log-concave distributions supported on convex compact sets. We propose a general proximal sampling framework that adapts unconstrained samplers—such as stochastic gradient MCMC—to constrained domains via Euclidean or canonical projections. To our knowledge, this is the first systematic development of a proximal sampling paradigm supporting multiple projection operators; we further design an efficient canonical projection algorithm based on membership queries, substantially reducing projection overhead. Theoretically, we derive non-asymptotic Wasserstein-1 and Wasserstein-2 error bounds, rigorously establishing that the framework preserves the original sampler’s convergence rate under constraints. Empirically, we validate the framework’s effectiveness and compatibility across diverse convex compact domains, including polytopes, norm balls, and spectrahedra.
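The summary mentions an efficient canonical (gauge) projection computed from membership queries alone. A minimal sketch of that idea, under the standard assumptions for a gauge: the constraint set is convex, compact, and contains the origin in its interior, so an infeasible point can be rescaled radially toward the origin until it enters the set, with the scaling factor found by bisection on the membership oracle. The function names and the bisection scheme here are illustrative, not the paper's actual algorithm.

```python
import numpy as np

def gauge_projection(x, is_member, n_bisect=50):
    # Radial (gauge) projection toward the origin using only membership queries:
    # find the largest t in [0, 1] with t * x in the set, via bisection.
    # Assumes the set is convex, compact, and contains the origin in its interior.
    if is_member(x):
        return x
    lo, hi = 0.0, 1.0
    for _ in range(n_bisect):
        mid = 0.5 * (lo + hi)
        if is_member(mid * x):
            lo = mid
        else:
            hi = mid
    return lo * x

# Example: project onto the unit l1 ball, touching it only through membership tests.
is_member = lambda z: np.abs(z).sum() <= 1.0
y = gauge_projection(np.array([2.0, 2.0]), is_member)
```

Each projection costs `n_bisect` membership queries, which is what makes this attractive on sets (e.g. spectrahedra) where a Euclidean projection would require solving an optimization problem.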
📝 Abstract
In this paper, we study sampling from strongly log-concave distributions supported on convex and compact sets. We propose a general proximal framework that projects onto the constraint set; the framework is highly flexible and supports multiple projection operators. Specifically, we consider Euclidean and gauge projections, the latter of which can be computed efficiently using only a membership oracle. The framework integrates seamlessly with a variety of sampling methods. Our analysis focuses on Langevin-type sampling algorithms in the constrained setting: we provide non-asymptotic upper bounds on the W1 and W2 errors and give a detailed comparison of the performance of these methods in constrained sampling.
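To make the projected Langevin-type scheme concrete, here is a minimal sketch of one instance of the idea the abstract describes: an unadjusted Langevin step followed by a Euclidean projection back onto the constraint set, targeting a standard Gaussian restricted to the unit ball (a strongly log-concave target on a convex compact support). The step size, iteration count, and function names are illustrative choices, not the paper's exact algorithm or tuning.

```python
import numpy as np

def euclidean_proj_ball(x, radius=1.0):
    # Euclidean projection onto the ball {x : ||x||_2 <= radius}
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_ula(grad_log_pi, x0, step, n_iters, proj, rng):
    # Projected unadjusted Langevin: gradient + noise step, then project
    # the iterate back onto the constraint set.
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_pi(x) + np.sqrt(2.0 * step) * noise
        x = proj(x)  # enforce the convex constraint
        samples.append(x.copy())
    return np.array(samples)

# Target: standard Gaussian restricted to the unit ball, so grad log pi(x) = -x.
grad_log_pi = lambda x: -x
rng = np.random.default_rng(0)
samples = projected_ula(grad_log_pi, np.zeros(2), step=0.01,
                        n_iters=5000, proj=euclidean_proj_ball, rng=rng)
```

Swapping `euclidean_proj_ball` for a different projection operator (e.g. a gauge projection built from a membership oracle) changes only the `proj` argument, which is the kind of modularity the abstract's "various projection options" refers to.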