Bregman projection for calibration estimation

📅 2026-03-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the efficient use of auxiliary information to enhance calibration estimation in survey sampling and data fusion. The authors propose a unified framework based on Bregman divergences, interpreting calibration as a Bregman projection of the weight vector onto constraints defined by auxiliary variables, and establish its equivalence to debiased regression estimation. This geometric perspective not only unifies classical calibration approaches but also yields an optimal contrast entropy divergence, which is extended to settings with unknown inclusion probabilities and high-dimensional auxiliary variables. Leveraging dual representations, asymptotic analysis, cross-fitting, and regularization techniques, the proposed estimator is shown to achieve root-n consistency. Simulation studies and empirical analyses demonstrate that the new method substantially outperforms conventional calibration estimators in both efficiency and stability.
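The simplest instance of the calibration-as-projection idea described above is quadratic calibration: with a chi-square (quadratic) Bregman generator, projecting the design weights onto the calibration constraints has a closed-form dual solution that is linear in the auxiliary variables, recovering the classical GREG weights. A minimal sketch with simulated data (all variable names and the data-generating setup are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3

# Hypothetical inputs: design weights d_i and auxiliary variables x_i,
# with known population totals t_x for the auxiliaries.
d = rng.uniform(1.0, 5.0, size=n)      # design weights (inverse inclusion probs)
X = rng.normal(size=(n, p))            # auxiliary variables
t_x = 1.1 * X.sum(axis=0)              # assumed known population totals

# Quadratic Bregman generator: minimize sum_i (w_i - d_i)^2 / (2 d_i)
# subject to X^T w = t_x.  The dual is p-dimensional and solvable in
# closed form; the calibrated weight is linear in x_i:
#   w_i = d_i * (1 + x_i^T lam),  lam = (X^T D X)^{-1} (t_x - X^T d)
DX = d[:, None] * X
lam = np.linalg.solve(X.T @ DX, t_x - X.T @ d)
w = d * (1.0 + X @ lam)

# The calibrated weights reproduce the auxiliary totals exactly.
print(np.allclose(X.T @ w, t_x))
```

Note that the dual variable `lam` has dimension p (the number of auxiliary variables), not n, which is the dual-representation property the summary refers to.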

📝 Abstract
Calibration weighting is a fundamental technique in survey sampling and data integration for incorporating auxiliary information and improving the efficiency of estimators. Classical calibration methods are typically formulated through distance functions applied to weight ratios relative to design weights. In this paper we develop a unified framework for calibration estimation based on Bregman divergence defined directly on the weight vector. We show that calibration estimators obtained from Bregman divergence admit a dual representation that depends only on the dimension of the auxiliary variables and can be interpreted as a Bregman projection onto the calibration constraint set. This geometric structure leads to a general asymptotic representation showing that calibration estimators are equivalent to debiased regression estimators whose regression coefficient depends on the choice of the Bregman generator. The result provides a unifying perspective on classical calibration methods such as quadratic calibration and exponential tilting, and reveals how the choice of divergence influences efficiency. Under Poisson sampling we further characterize the generator that minimizes the asymptotic variance of the calibration estimator and obtain an optimal contrast entropy divergence. The framework also extends naturally to settings where inclusion probabilities are unknown and must be estimated, yielding cross-fitted estimators that remain root-n consistent under mild conditions. Finally, we develop a regularized calibration estimator suitable for high-dimensional auxiliary variables. Simulation studies and a real data application illustrate the practical advantages of the proposed approach.
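The exponential-tilting case mentioned in the abstract corresponds to an entropy-type Bregman generator: the calibrated weights take the form w_i = d_i exp(x_i' lam), and the p-dimensional dual problem is convex, so Newton's method solves it directly. A hedged sketch on simulated data (the setup and tolerances are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 300, 2
d = rng.uniform(1.0, 4.0, size=n)      # design weights
X = rng.normal(size=(n, p))            # auxiliary variables
t_x = X.sum(axis=0)                    # assumed known population totals

# Exponential tilting: w_i = d_i * exp(x_i^T lam), with lam chosen so
# that X^T w = t_x.  The dual objective
#   phi(lam) = sum_i d_i exp(x_i^T lam) - lam^T t_x
# is convex in the p-dimensional lam; solve it with Newton steps.
lam = np.zeros(p)
for _ in range(50):
    w = d * np.exp(X @ lam)
    grad = X.T @ w - t_x               # dual gradient
    if np.max(np.abs(grad)) < 1e-8:
        break
    hess = X.T @ (w[:, None] * X)      # dual Hessian (positive definite)
    lam -= np.linalg.solve(hess, grad)

w = d * np.exp(X @ lam)
# Constraints hold and weights stay strictly positive by construction.
print(np.allclose(X.T @ w, t_x), np.all(w > 0))
```

Because the weights are exponentials of a linear index, they are automatically positive, which is one practical reason exponential tilting is often preferred over the quadratic generator.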
Problem

Research questions and friction points this paper is trying to address.

calibration estimation
Bregman divergence
survey sampling
auxiliary information
high-dimensional variables
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bregman divergence
calibration weighting
dual representation
asymptotic efficiency
regularized estimation
Jae Kwang Kim
Yonghyun Kwon
Yumou Qiu
Iowa State University
Statistics