Refined Cluster Robust Inference

📅 2026-03-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study addresses the distortion in inference that arises when the number of clusters is small, a setting in which the conventional normal approximation to the t-statistic performs poorly. The authors develop a higher-order correction to critical values based on a conditional Cramér–Edgeworth expansion, enabling third-order accurate inference that is robust to both cluster dependence and heterogeneity. The proposed method is agnostic to the nature of regressors and does not require resampling techniques such as cluster bootstrap, thereby offering substantial improvements in size control for small samples. Simulation evidence demonstrates that the approach markedly outperforms existing methods—even with as few as ten clusters—while maintaining computational simplicity and theoretical rigor.

Technology Category

Application Category

📝 Abstract
It has become standard for empirical studies to conduct inference robust to cluster dependence and heterogeneity. With a small number of clusters, the normal approximation for the $t$-statistics of regression coefficients may be poor. This paper tackles this problem using a critical value based on the conditional Cramér-Edgeworth expansion for the $t$-statistics. Our approach guarantees third-order refinement, regardless of whether a regressor is discrete or not, and, unlike the cluster pairs bootstrap, avoids resampling data. Simulations show that our proposal can make a difference in size control with as few as 10 clusters.
Problem

Research questions and friction points this paper is trying to address.

cluster robust inference
small number of clusters
t-statistics
normal approximation
inference accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

cluster robust inference
Cramér-Edgeworth expansion
small number of clusters
third-order refinement
t-statistics