🤖 AI Summary
This study addresses challenges in primary data analysis arising from covariate shift and outcome coarsening by proposing an empirical likelihood–based fusion method that incorporates summary-level predictions from external black-box machine learning models into multinomial logistic regression through moment constraints. The approach enhances estimation efficiency and robustness under mild conditions without requiring explicit modeling of density ratios. Theoretical results establish the consistency and asymptotic normality of the fused estimator. Simulations and an empirical application to multinomial blood-pressure classification using NHANES data demonstrate substantially improved inferential efficiency compared to conventional methods that rely solely on primary data. This work is the first to integrate predictive information from black-box models into interpretable models via rich and robust moment conditions, with rigorous theoretical guarantees of strict efficiency gains.
📝 Abstract
In many modern applications, a carefully designed primary study provides individual-level data for interpretable modeling, while summary-level external information is available through black-box, efficient, and nonparametric machine-learning predictions. Although summary-level external information has been studied in the data integration literature, there is limited methodology for leveraging external nonparametric machine-learning predictions to improve statistical inference in the primary study. We propose a general empirical-likelihood framework that incorporates external predictions through moment constraints. An advantage of nonparametric machine-learning prediction is that it induces a rich class of valid moment restrictions that remain robust to covariate shift under a mild overlap condition, without requiring explicit density-ratio modeling. We focus on multinomial logistic regression as the primary model and address common data-quality issues in external sources, including coarsened outcomes, partially observed covariates, covariate shift, and heterogeneity in generating mechanisms known as concept shift. We establish large-sample properties of the resulting fused estimator, including consistency and asymptotic normality under regularity conditions. Moreover, we provide mild sufficient conditions under which incorporating external predictions delivers a strict efficiency gain relative to the primary-only estimator. Simulation studies and an application to the National Health and Nutrition Examination Survey on multiclass blood-pressure classification corroborate the theoretical findings and demonstrate substantial efficiency gains over methods that use the primary data alone.
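To make the core idea concrete, below is a stylized, self-contained sketch (not the authors' estimator) of fusing a primary multinomial logistic regression with external black-box predictions through moment constraints. The simulated data, the noisy "external predictions" `P_ext`, and the identity-weighted GMM objective are all illustrative assumptions: the score equations from the primary likelihood are stacked with calibration moments that match the model's fitted probabilities to the external predictions, standing in for the paper's empirical-likelihood constraints.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def probs(beta, X):
    # beta: (p, K-1); the last category is the reference with zero coefficients
    n, p = X.shape
    B = np.hstack([np.reshape(beta, (p, -1)), np.zeros((p, 1))])
    return softmax(X @ B)

# --- simulate a small primary study (K = 3 outcome classes) ---
n, p, K = 400, 2, 3
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([[0.5, -0.5], [1.0, -1.0]])          # (p, K-1)
P = probs(beta_true, X)
Y = np.array([rng.choice(K, p=row) for row in P])
Yind = np.eye(K)[Y]

# hypothetical "black-box" external predictions: true probabilities plus noise
P_ext = np.clip(P + rng.normal(scale=0.02, size=P.shape), 1e-3, 1.0)
P_ext /= P_ext.sum(axis=1, keepdims=True)

def stacked_moments(beta):
    Pb = probs(beta, X)
    score = Yind - Pb                                     # likelihood-score moments
    g1 = (X[:, :, None] * score[:, None, :K - 1]).reshape(n, -1)
    calib = P_ext - Pb                                    # prediction-calibration moments
    g2 = (X[:, :, None] * calib[:, None, :K - 1]).reshape(n, -1)
    return np.hstack([g1, g2]).mean(axis=0)

def gmm_obj(beta_flat):
    g = stacked_moments(beta_flat)
    return n * (g @ g)                                    # identity-weighted GMM criterion

res = minimize(gmm_obj, np.zeros(p * (K - 1)), method="BFGS")
beta_hat = res.x.reshape(p, K - 1)
print(beta_hat)
```

Both moment sets are zero in expectation at the true parameter, so the extra calibration moments add information rather than bias; the paper's empirical-likelihood weighting plays the role that the (here crude) identity GMM weighting plays in this sketch.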