On the Posterior Computation Under the Dirichlet-Laplace Prior

📅 2025-07-07
🤖 AI Summary
In high-dimensional Bayesian inference, Gibbs sampling implementations of the Dirichlet–Laplace (DL) prior suffer from systematic bias due to ambiguities in a critical step, causing empirical samples to deviate from the target posterior and undermining the prior's asymptotic shrinkage guarantees. Method: We provide the first rigorous characterization of the conditional posterior structure under the DL prior, identify and resolve implicit sampling ambiguities in the original algorithm, and propose an exact Gibbs sampler provably convergent to the correct posterior. The new scheme preserves computational efficiency while ensuring theoretical correctness for both the normal means model and high-dimensional linear regression. Results: Simulation and real-data experiments demonstrate that the corrected method substantially improves posterior distributional accuracy and variable selection consistency, so that the DL prior's asymptotic theoretical properties are faithfully realized in finite-sample implementations.
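
For reference, the DL(a) prior on a p-dimensional mean vector is commonly written through the hierarchical representation below (as in Bhattacharya, Pati, Pillai and Dunson, 2015). The notation here is added for illustration and is not quoted from the paper itself.

```latex
\begin{align*}
\theta_j \mid \psi_j, \phi_j, \tau &\sim \mathcal{N}\!\left(0,\; \psi_j \phi_j^{2} \tau^{2}\right), \qquad j = 1, \dots, p,\\
\psi_j &\sim \operatorname{Exp}(1/2),\\
(\phi_1, \dots, \phi_p) &\sim \operatorname{Dirichlet}(a, \dots, a),\\
\tau &\sim \operatorname{Gamma}(pa,\, 1/2).
\end{align*}
```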

📝 Abstract
Modern applications routinely collect high-dimensional data, leading to statistical models with more parameters than available samples. A common solution is to impose sparsity in parameter estimation, often via penalized optimization methods. Bayesian approaches provide a probabilistic framework that formally quantifies uncertainty through shrinkage priors. Among these, the Dirichlet-Laplace prior has gained recognition for its theoretical guarantees and wide applicability. This article identifies a critical yet overlooked issue in the implementation of Gibbs sampling algorithms for such priors. We demonstrate that ambiguities in the presentation of key algorithmic steps, while mathematically coherent, have led to widespread implementation inaccuracies that fail to target the intended posterior distribution, the very distribution endowed with rigorous asymptotic guarantees. Using the normal-means problem and high-dimensional linear regression as canonical examples, we clarify these implementation pitfalls and their practical consequences and propose corrected and more efficient sampling procedures.
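
To make the sampling steps under discussion concrete, here is a minimal sketch of the widely used Gibbs sampler for the normal means model y_j ~ N(theta_j, 1) under the DL(a) prior, following the standard conditional updates of Bhattacharya et al. (2015). It is illustrative only: it reproduces the commonly implemented scheme whose ambiguities the paper analyzes, not the corrected sampler proposed by the authors, and the function name, default settings, and numerical safeguards are our own.

```python
import numpy as np
from scipy.stats import invgauss, geninvgauss


def dl_gibbs_normal_means(y, a=0.5, n_iter=2000, seed=0):
    """Illustrative Gibbs sampler for y_j ~ N(theta_j, 1) under a DL(a) prior.

    Follows the commonly used conditional updates (Bhattacharya et al., 2015);
    this is NOT the corrected sampler proposed in the paper summarized above.
    """
    rng = np.random.default_rng(seed)
    p = len(y)
    theta = np.asarray(y, dtype=float).copy()
    phi = np.full(p, 1.0 / p)
    tau = 1.0
    draws = np.empty((n_iter, p))

    def rgig(lam, rho, chi):
        # GIG(lam, rho, chi): density proportional to
        # x^(lam - 1) * exp(-(rho*x + chi/x)/2), mapped onto scipy's geninvgauss.
        chi = np.maximum(chi, 1e-10)  # numerical floor when |theta_j| is near 0
        return geninvgauss.rvs(lam, np.sqrt(rho * chi),
                               scale=np.sqrt(chi / rho), random_state=rng)

    for it in range(n_iter):
        abs_theta = np.maximum(np.abs(theta), 1e-10)

        # 1. psi_j: 1/psi_j ~ InverseGaussian(mean = phi_j * tau / |theta_j|, shape = 1)
        psi = 1.0 / invgauss.rvs(phi * tau / abs_theta, random_state=rng)

        # 2. tau ~ GIG(p*(a - 1), 1, 2 * sum_j |theta_j| / phi_j)
        tau = rgig(p * (a - 1.0), 1.0, 2.0 * np.sum(abs_theta / phi))

        # 3. phi: draw T_j ~ GIG(a - 1, 1, 2|theta_j|) independently, then normalize
        T = rgig(np.full(p, a - 1.0), 1.0, 2.0 * abs_theta)
        phi = T / np.sum(T)

        # 4. theta_j | rest ~ N(v_j*y_j/(1+v_j), v_j/(1+v_j)), with v_j = psi_j*phi_j^2*tau^2
        v = psi * phi ** 2 * tau ** 2
        post_var = v / (1.0 + v)
        theta = rng.normal(post_var * y, np.sqrt(post_var))

        draws[it] = theta

    return draws


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = np.concatenate([np.full(5, 4.0), np.zeros(95)])  # sparse mean vector
    y = truth + rng.standard_normal(truth.size)
    samples = dl_gibbs_normal_means(y, a=1.0 / truth.size)
    print("posterior means (first 10):", samples[500:].mean(axis=0)[:10].round(2))
```

The generalized inverse Gaussian draws are mapped onto scipy.stats.geninvgauss via its (p, b, scale) parameterization; the choice a = 1/p in the usage example is a common default for strongly sparse settings, not a recommendation from the paper.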
Problem

Research questions and friction points this paper is trying to address.

Addresses inaccuracies in Gibbs sampling for Dirichlet-Laplace priors
Clarifies implementation pitfalls in high-dimensional Bayesian models
Proposes corrected sampling procedures for accurate posterior computation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Corrects Gibbs sampling for Dirichlet-Laplace prior
Addresses implementation pitfalls in posterior computation
Proposes efficient sampling for high-dimensional models
Paolo Onorati
Department of Statistical Sciences, University of Padova, Italy
David B. Dunson
Department of Statistical Science, Duke University
Antonio Canale
Associate professor, University of Padova
Bayesian nonparametrics, Functional Data Analysis, Flexible distributions