🤖 AI Summary
This work addresses the challenge of locally adapting tuning parameters, particularly path length, in Hamiltonian Monte Carlo (HMC) when sampling from complex, high-dimensional target distributions. The authors propose the Gibbs self-tuning (GIST) framework, which resamples tuning parameters (e.g., path length) at each HMC iteration via Gibbs sampling conditional on the current position and momentum. GIST unifies randomized HMC, multinomial HMC, the No-U-Turn Sampler (NUTS), and the Apogee-to-Apogee Path Sampler as special cases of a single probabilistic framework, and it introduces a novel, locally adaptive path-length strategy as an alternative to NUTS. The framework is evaluated with an exact Hamiltonian for a high-dimensional, ill-conditioned Gaussian target and with the leapfrog integrator across a suite of diverse models.
📝 Abstract
We introduce a novel and flexible framework for constructing locally adaptive Hamiltonian Monte Carlo (HMC) samplers by Gibbs sampling the algorithm's tuning parameters conditional on the position and momentum at each step. For adaptively sampling path lengths, this framework -- which we call Gibbs self-tuning (GIST) -- encompasses randomized HMC, multinomial HMC, the No-U-Turn Sampler (NUTS), and the Apogee-to-Apogee Path Sampler as special cases. The GIST framework is illustrated with a novel alternative to NUTS for locally adapting path lengths, evaluated with an exact Hamiltonian for a high-dimensional, ill-conditioned Gaussian measure and with the leapfrog integrator for a suite of diverse models.
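To make the idea concrete, here is a minimal sketch of one HMC iteration in which the number of leapfrog steps is resampled at every step. For simplicity it uses a state-independent uniform conditional over path lengths (the randomized-HMC special case mentioned in the abstract), so the standard Metropolis-Hastings correction suffices; the function names, step size, and `max_steps` bound are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Standard leapfrog integration of Hamiltonian dynamics for L steps."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)            # initial half-step for momentum
    for i in range(L):
        q += eps * p                      # full position step
        if i != L - 1:
            p -= eps * grad_U(q)          # full momentum step (except last)
    p -= 0.5 * eps * grad_U(q)            # final half-step for momentum
    return q, p

def self_tuning_hmc_step(q, U, grad_U, eps, rng, max_steps=32):
    """One HMC iteration with the path length resampled each iteration.

    The conditional over L here is uniform and independent of (q, p),
    so the usual Metropolis-Hastings acceptance probability applies.
    """
    p = rng.standard_normal(q.shape)      # refresh momentum
    L = rng.integers(1, max_steps + 1)    # resample path length
    q_new, p_new = leapfrog(q, p, grad_U, eps, L)
    H_old = U(q) + 0.5 * p @ p            # Hamiltonian before
    H_new = U(q_new) + 0.5 * p_new @ p_new  # Hamiltonian after
    if rng.random() < np.exp(H_old - H_new):
        return q_new                      # accept proposal
    return q                              # reject, keep current state
```

State-dependent conditionals for the path length (as in NUTS or the novel GIST variant) require an additional correction term in the acceptance ratio, which this simplified sketch omits.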