🤖 AI Summary
This paper addresses the weak causal interpretability of the Kitagawa–Oaxaca–Blinder (KOB) decomposition under a lack of common support and model misspecification. Building on the potential outcomes framework, we reformulate KOB by introducing a weighted reference outcome specification and integrating doubly robust estimation with Neyman orthogonality. The resulting decomposition is implemented via double machine learning, enabling a nonparametric decomposition of mean differences that avoids both trimming and extrapolation. Our approach alleviates the conventional KOB's strong reliance on overlap and parametric functional forms. In two empirical applications, the proposed method effectively overcomes insufficient support overlap, yielding more robust and causally interpretable decompositions, although the decomposition based on the Neumark reference outcome remains particularly sensitive to the inclusion of irrelevant covariates. It thus provides a more reliable technical pathway for the causal attribution of intergroup mean differences.
📝 Abstract
The Kitagawa-Oaxaca-Blinder decomposition splits the difference in means between two groups into an explained part, due to observable factors, and an unexplained part. In this paper, we reformulate this framework using potential outcomes, highlighting the critical role of the reference outcome. To address limitations such as the lack of common support and model misspecification, we extend Neumark's (1988) weighted reference approach with a doubly robust estimator. Using Neyman orthogonality and double machine learning, our method avoids trimming and extrapolation. This improves flexibility and robustness, as illustrated by two empirical applications. Nevertheless, we also highlight that the decomposition based on the Neumark reference outcome is particularly sensitive to the inclusion of irrelevant explanatory variables.
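The split into an explained and an unexplained part can be illustrated with the classical two-fold decomposition that the paper builds on (not its doubly robust extension). The sketch below uses simulated data and group B's coefficients as the reference; all variable names and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols(X, y):
    # Least-squares coefficients; the intercept is the first column of X
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Simulated groups A and B (illustrative): one intercept, one covariate
n = 500
X_a = np.column_stack([np.ones(n), rng.normal(1.0, 1.0, n)])
X_b = np.column_stack([np.ones(n), rng.normal(0.5, 1.0, n)])
y_a = X_a @ np.array([1.0, 2.0]) + rng.normal(0, 1, n)
y_b = X_b @ np.array([0.5, 1.5]) + rng.normal(0, 1, n)

beta_a, beta_b = ols(X_a, y_a), ols(X_b, y_b)
xbar_a, xbar_b = X_a.mean(axis=0), X_b.mean(axis=0)

gap = y_a.mean() - y_b.mean()
# Explained: covariate differences priced at the reference (group-B) coefficients
explained = (xbar_a - xbar_b) @ beta_b
# Unexplained: coefficient differences evaluated at group-A covariate means
unexplained = xbar_a @ (beta_a - beta_b)

# With intercepts, OLS fits pass through the group means,
# so the two parts sum exactly to the raw mean gap
assert np.isclose(explained + unexplained, gap)
```

The choice of reference coefficients (here group B) is exactly the "reference outcome" whose role the paper emphasizes; Neumark's weighted reference replaces `beta_b` with coefficients from a pooled regression.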