Horseshoe Priors and MDP

📅 2026-04-01
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study addresses the super-efficient convergence and tail robustness of Bayesian predictive densities in high-dimensional sparse settings. By linking the logarithmic-pole singularity of the horseshoe prior near the origin to its finite-sample properties and to the moderate deviation principle (MDP), the work constructs, for the first time, a unified information-theoretic framework grounded in a log-budget principle. The key contributions are: uncovering a single mechanism underlying the horseshoe prior's sparsity adaptation, super-efficiency, and tail robustness; deriving the MDP critical threshold $t_{\text{crit}} = \sqrt{\log(\pi n/2)}$; and proving that the asymptotic Bayes risk equals $p_0 \log(p/p_0)/n$, thereby achieving asymptotic Bayes optimality under sparsity (ABOS).
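As a quick numerical illustration of the two closed-form quantities above (the function names are our own, not from the paper), both the MDP critical threshold and the ABOS risk rate can be evaluated directly:

```python
import math

def mdp_critical_threshold(n: int) -> float:
    """MDP critical threshold t_crit = sqrt(log(pi * n / 2)) from the summary."""
    return math.sqrt(math.log(math.pi * n / 2))

def abos_bayes_risk(p: int, p0: int, n: int) -> float:
    """Asymptotic Bayes risk rate p0 * log(p / p0) / n under sparsity (ABOS)."""
    return p0 * math.log(p / p0) / n

# Hypothetical example: n = 1000 observations, p = 1000 parameters,
# p0 = 10 nonzero signals.
print(f"t_crit(n=1000) = {mdp_critical_threshold(1000):.4f}")
print(f"risk(p=1000, p0=10, n=1000) = {abos_bayes_risk(1000, 10, 1000):.6f}")
```

Note how slowly the threshold grows: quadrupling $n$ raises $t_{\text{crit}}$ only by roughly $\sqrt{\log 4}$ in squared terms, which is the logarithmic budget the abstract refers to.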
📝 Abstract
Carvalho (2010) established two foundational theorems for the horseshoe prior: tight two-sided logarithmic bounds on the marginal density near the origin (Theorem 1.1), and a super-efficient rate of convergence of the Bayes predictive density to the true sampling density in sparse situations (Theorem 2). The "Shrink Globally, Act Locally" paper \citep{polson2010shrink} formalised necessary and sufficient conditions on the prior's behaviour at the origin for sparsity adaptation as $p \to \infty$. We show that these results are not merely descriptive properties of the horseshoe; they are the finite-sample precursors to the asymptotic moderate deviation principle (MDP) of \citet{datta2026newlook}. The log-pole singularity $\pi_{\text{HS}}(\theta) \asymp -\log|\theta|$ is precisely the origin integrability boundary that selects the MDP threshold $t_{\text{crit}} = \sqrt{\log(\pi n/2)}$; super-efficiency below the threshold and tail robustness above it together produce the ABOS Bayes risk $p_0 \log(p/p_0)/n$; and the Clarke-Barron information-theoretic asymptotics of Bayes methods provide the unifying framework in which all three results are faces of a single logarithmic budget principle.
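The log-pole bounds behind the abstract's Theorem 1.1 can be checked numerically: the horseshoe marginal $\pi_{\text{HS}}(\theta) = \int_0^\infty N(\theta; 0, \lambda^2)\, C^+(\lambda; 0, 1)\, d\lambda$ should lie between $(K/2)\log(1 + 4/\theta^2)$ and $K\log(1 + 2/\theta^2)$ with $K = 1/\sqrt{2\pi^3}$, both of which diverge like $-\log|\theta|$ near the origin. A minimal sketch, assuming these bound constants and using our own quadrature setup (not the paper's code):

```python
import math
from scipy.integrate import quad

K = 1.0 / math.sqrt(2.0 * math.pi ** 3)

def horseshoe_marginal(theta: float) -> float:
    """Marginal horseshoe density at theta: the normal scale mixture
    N(theta; 0, lam^2) integrated against the half-Cauchy C+(lam; 0, 1)."""
    def integrand(lam: float) -> float:
        normal = math.exp(-theta ** 2 / (2 * lam ** 2)) / (math.sqrt(2 * math.pi) * lam)
        half_cauchy = 2.0 / (math.pi * (1.0 + lam ** 2))
        return normal * half_cauchy
    # Resolve the near-origin behaviour around lam ~ |theta|, then add the
    # slowly decaying tail (truncated at 200; the remainder is ~1e-6).
    head, _ = quad(integrand, 0.0, 1.0, points=[abs(theta)], limit=200)
    tail, _ = quad(integrand, 1.0, 200.0, limit=200)
    return head + tail

for theta in (1e-2, 1e-3, 1e-4):
    lower = 0.5 * K * math.log(1.0 + 4.0 / theta ** 2)
    upper = K * math.log(1.0 + 2.0 / theta ** 2)
    print(f"theta={theta:g}: {lower:.4f} < {horseshoe_marginal(theta):.4f} < {upper:.4f}")
```

The density grows without bound as $\theta \to 0$, but only logarithmically; that slow divergence is the "origin integrability boundary" the abstract ties to the MDP threshold.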
Problem

Research questions and friction points this paper is trying to address.

Horseshoe Prior
Sparsity
Moderate Deviation Principle
Bayes Risk
Super-efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Horseshoe Prior
Moderate Deviation Principle
Sparsity Adaptation
Super-efficiency
Logarithmic Budget Principle