🤖 AI Summary
Conventional Gaussian empirical Bayes methods rely on the strong assumption that parameters are independent of their standard errors, a premise that is often theoretically questionable and empirically rejected. Method: We propose CLOSE, a systematic framework for modeling conditional location-scale families under precision dependence, directly characterizing the parameter's conditional distribution given its standard error to enable adaptive shrinkage estimation. Contribution/Results: CLOSE unifies and generalizes several existing approaches; its most flexible variant combines model parsimony, computational efficiency, and competitive decision regret. In a U.S. Census application (selecting high-mobility communities), CLOSE substantially improves selection accuracy and outperforms benchmark methods predicated on the precision-independence assumption.
📝 Abstract
Gaussian empirical Bayes methods usually maintain a precision independence assumption: the unknown parameters of interest are independent of the known standard errors of the estimates. This assumption is often theoretically questionable and empirically rejected. This paper proposes to model the conditional distribution of the parameter given the standard errors as a flexibly parametrized location-scale family of distributions, leading to a family of methods that we call CLOSE. The CLOSE framework unifies and generalizes several proposals under precision dependence. We argue that the most flexible member of the CLOSE family is a minimalist and computationally efficient default for accounting for precision dependence. We analyze this method and show that it is competitive in terms of the regret of subsequent decision rules. Empirically, using CLOSE leads to sizable gains for selecting high-mobility Census tracts.
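As a rough illustration of the modeling step the abstract describes, a conditional location-scale specification can be written as follows. The notation here ($\theta_i$ for the parameter, $\sigma_i$ for its standard error, and the functions $m$, $s$ and distribution $G$) is illustrative and need not match the paper's own symbols:

```latex
% Observed estimate with known standard error:
%   Y_i | \theta_i, \sigma_i ~ N(\theta_i, \sigma_i^2)
% Conditional location-scale model for the parameter given the standard error:
\theta_i \mid \sigma_i \;\sim\; m(\sigma_i) + s(\sigma_i)\,U_i,
\qquad U_i \sim G, \quad U_i \perp \sigma_i.
```

Precision independence is recovered as the special case in which $m(\cdot)$ and $s(\cdot)$ are constant, so that $\theta_i$ and $\sigma_i$ are independent; allowing $m$ and $s$ to vary with $\sigma_i$ is what lets shrinkage adapt to precision dependence.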