Identification-aware Markov chain Monte Carlo

📅 2025-11-16
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Non-identifiability in statistical models induces posterior multimodality or flatness due to parameter equivalence classes, causing conventional MCMC methods to suffer from slow convergence or failure. To address this, we propose a novel MCMC framework that incorporates identifiability-inducing prior information by explicitly modeling observationally equivalent parameter sets, thereby guiding samplers to escape local modes and traverse flat posterior regions. Our approach systematically enhances both random-walk Metropolis–Hastings and Hamiltonian Monte Carlo (HMC), significantly improving mixing efficiency and convergence speed in high-dimensional and strongly multimodal settings. Simulation studies demonstrate superior sampling efficiency and mode coverage compared to standard HMC and sequential Monte Carlo. In an empirical application to structural vector moving average (SVMA) models—long considered computationally intractable for Bayesian inference—our method uncovers the model’s intricate multimodal posterior structure for the first time, effectively overcoming a longstanding computational bottleneck in Bayesian inference for non-identifiable models.

Technology Category

Application Category

📝 Abstract
Leaving posterior sensitivity concerns aside, non-identifiability of the parameters does not raise a difficulty for Bayesian inference as far as the posterior is proper, but multi-modality or flat regions of the posterior induced by the lack of identification leaves a challenge for modern Bayesian computation. Sampling methods often struggle with slow or non-convergence when dealing with multiple modes or flat regions of the target distributions. This paper develops a novel Markov chain Monte Carlo (MCMC) approach for non-identified models, leveraging the knowledge of observationally equivalent sets of parameters, and highlights an important role that identification plays in modern Bayesian analysis. We show that our proposal overcomes the issues of being trapped in a local mode and achieves a faster rate of convergence than the existing MCMC techniques including random walk Metropolis-Hastings and Hamiltonian Monte Carlo. The gain in the speed of convergence is more significant as the dimension or cardinality of the identified sets increases. Simulation studies show its superior performance compared to other popular computational methods including Hamiltonian Monte Carlo and sequential Monte Carlo. We also demonstrate that our method uncovers non-trivial modes in the target distribution in a structural vector moving-average (SVMA) application.
Problem

Research questions and friction points this paper is trying to address.

Addresses slow convergence in non-identified Bayesian models
Overcomes multimodality issues in MCMC sampling methods
Improves performance for high-dimensional parameter spaces
Innovation

Methods, ideas, or system contributions that make the work stand out.

Identification-aware MCMC for non-identified models
Leverages observationally equivalent parameter sets
Overcomes local mode trapping and improves convergence
🔎 Similar Papers
No similar papers found.