High-dimensional Adaptive MCMC with Reduced Computational Complexity

📅 2026-04-10
📈 Citations: 0 · Influential: 0
🤖 AI Summary
This work addresses the challenge of high computational cost in dense preconditioning for high-dimensional MCMC sampling, where traditional dense preconditioners incur O(d²) complexity and diagonal alternatives fail to capture strong correlations in the target distribution. The authors propose an adaptive MCMC method that dynamically learns the covariance structure of the target via online principal component analysis and constructs a sparse parametrization of a dense preconditioner as the product of a diagonal matrix and a small number of Householder reflectors. This design preserves the ability to model inter-variable correlations while reducing per-iteration complexity to O(m²d), where m ≪ d is a user-specified rank. Empirical results demonstrate that the proposed approach achieves higher absolute sampling efficiency than diagonal preconditioning and outperforms both standard dense and various diagonal-plus-low-rank alternatives in time-normalized performance.
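The preconditioner described above, a diagonal matrix composed with a small number of reflection matrices, can be applied to a vector in O(md) time, since each reflector costs O(d). The following sketch is illustrative only (the function name and the use of unit-norm Householder vectors are assumptions, not the authors' implementation):

```python
import numpy as np

def apply_preconditioner(x, diag, reflectors):
    """Apply P x, where P is the product of m Householder reflectors
    H_k = I - 2 v_k v_k^T (each v_k a unit vector) and a diagonal matrix D.

    x          : (d,) input vector
    diag       : (d,) diagonal of D
    reflectors : list of m unit vectors v_k, each of shape (d,)

    Cost: O(d) for the diagonal scaling plus O(d) per reflector, so O(md)
    overall -- no d x d matrix is ever formed.
    """
    y = diag * x                      # D x, elementwise
    for v in reflectors:
        y = y - 2.0 * v * (v @ y)     # H_k y without materialising H_k
    return y
```

Because each Householder reflector is orthogonal, the reflector product is an isometry: with `diag` set to all ones, the output has the same Euclidean norm as the input, which gives a quick sanity check on an implementation like this.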

📝 Abstract
We propose an adaptive MCMC method that learns a linear preconditioner which is dense in its off-diagonal elements but sparse in its parametrisation. Due to this sparsity, we achieve a per-iteration computational complexity of $O(m^2d)$ for a user-determined parameter $m$, compared with the $O(d^2)$ complexity of existing adaptive strategies that can capture correlation information from the target. Diagonal preconditioning has an $O(d)$ per-iteration complexity, but is known to fail in the case that the target distribution is highly correlated, see \citet[Section 3.5]{hird2025a}. Our preconditioner is constructed using eigeninformation from the target covariance which we infer using online principal components analysis on the MCMC chain. It is composed of a diagonal matrix and a product of carefully chosen reflection matrices. On various numerical tests we show that it outperforms diagonal preconditioning in terms of absolute performance, and that it outperforms traditional dense preconditioning and multiple diagonal plus low-rank alternatives in terms of time-normalised performance.
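The abstract mentions inferring eigeninformation of the target covariance via online principal components analysis on the MCMC chain. One standard way to do this, shown here as a hedged sketch rather than the paper's actual estimator, is Oja's rule with a QR re-orthonormalisation step; the QR factorisation of a d x m matrix costs O(m²d), consistent with the stated per-iteration complexity:

```python
import numpy as np

def oja_update(V, x, lr):
    """One step of Oja's rule for tracking the top-m principal subspace.

    V  : (m, d) current row-orthonormal estimates of the top eigenvectors
    x  : (d,) new (centred) sample, e.g. the current MCMC state minus a
         running mean estimate
    lr : step size

    The Hebbian update pushes the rows of V toward the dominant directions
    of E[x x^T]; QR re-orthonormalisation keeps the rows an orthonormal
    basis and costs O(m^2 d) per step.
    """
    V = V + lr * np.outer(V @ x, x)   # Hebbian step toward x x^T V
    Q, _ = np.linalg.qr(V.T)          # re-orthonormalise the m columns
    return Q.T
```

A practical scheme would also decay `lr` over iterations and maintain a running mean to centre the samples; those details are omitted here.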
Problem

Research questions and friction points this paper is trying to address.

high-dimensional
adaptive MCMC
computational complexity
preconditioning
correlated targets
Innovation

Methods, ideas, or system contributions that make the work stand out.

adaptive MCMC
preconditioning
online PCA
computational complexity
high-dimensional sampling
Max Hird
Department of Statistics and Actuarial Science, University of Waterloo
Samuel Livingstone
Associate professor in mathematical statistics, University College London
Statistics · Probability