🤖 AI Summary
Classical Cramér–Rao theory characterizes only the sampling variance (i.e., instability under independent repeated sampling) of estimators, neglecting their sensitivity to infinitesimal additive perturbations of the underlying data distribution.
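For context, the classical bound referred to here is a standard fact (it is background, not the paper's new inequality) and can be stated as:

```latex
% Classical Cramér–Rao lower bound (standard background, not the paper's WCRLB).
% For an unbiased estimator \hat{\theta}(X) of \theta under a regular model p(x;\theta):
\[
  \operatorname{Var}_\theta\!\bigl(\hat{\theta}(X)\bigr)
    \;\ge\; \frac{1}{I(\theta)},
  \qquad
  I(\theta) \;=\; \mathbb{E}_\theta\!\left[\bigl(\partial_\theta \log p(X;\theta)\bigr)^{2}\right].
\]
% The paper's contribution replaces the variance on the left by a
% "sensitivity" functional and the Fisher information geometry by
% Wasserstein geometry; the exact form of the WCRLB is given in the paper.
```

The left-hand side is what the summary calls the sampling variance; the Wasserstein analogue bounds sensitivity to distributional perturbations instead.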
Method: We develop an unbiased estimation theory grounded in Wasserstein geometry, bringing the Wasserstein distance, the central object of optimal transport, into sensitivity analysis. This yields the Wasserstein–Cramér–Rao lower bound (WCRLB), a fundamental inequality quantifying an estimator's sensitivity to distributional shifts.
Contribution/Results: We derive necessary and sufficient conditions for WCRLB attainability and establish that exponential families admit WCRLB-achieving estimators. Moreover, we prove that Wasserstein projection estimators achieve asymptotically optimal sensitivity. The framework unifies statistical manifolds, gradient operators, and information geometry, providing a foundational inequality tool for robust estimation and distributional perturbation analysis.
📝 Abstract
The quantity of interest in the classical Cramér–Rao theory of unbiased estimation (e.g., the Cramér–Rao lower bound, its exact attainment for exponential families, and the asymptotic efficiency of maximum likelihood estimation) is the variance, which represents the instability of an estimator when its value is compared to the value for an independently sampled data set from the same distribution. In this paper we are interested in a quantity which represents the instability of an estimator when its value is compared to the value for an infinitesimal additive perturbation of the original data set; we refer to this as the "sensitivity" of an estimator. The resulting theory of sensitivity is based on the Wasserstein geometry in the same way that the classical theory of variance is based on the Fisher–Rao (equivalently, Hellinger) geometry, and this insight allows us to derive a collection of results analogous to the classical case: a Wasserstein–Cramér–Rao lower bound for the sensitivity of any unbiased estimator, a characterization of the models in which there exist unbiased estimators achieving the lower bound exactly, and some concrete results showing that the Wasserstein projection estimator achieves the lower bound asymptotically. We use these results to treat many statistical examples, sometimes revealing new optimality properties of existing estimators and other times revealing entirely new estimators.
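The variance/sensitivity distinction can be illustrated numerically. The sketch below (my own toy illustration, not code from the paper) uses the sample mean: variance is measured by re-estimating across independent resamples, while sensitivity is approximated by a finite-difference stand-in for the paper's infinitesimal additive perturbation, shifting every data point by a small ε.

```python
import random

def sample_mean(xs):
    """The estimator under study: the ordinary sample mean."""
    return sum(xs) / len(xs)

random.seed(0)
n = 1000
data = [random.gauss(0.0, 1.0) for _ in range(n)]

# Variance: spread of the estimator across independent resamples
# from the same distribution (the classical notion of instability).
estimates = []
for _ in range(200):
    resample = [random.gauss(0.0, 1.0) for _ in range(n)]
    estimates.append(sample_mean(resample))
m = sum(estimates) / len(estimates)
variance = sum((e - m) ** 2 for e in estimates) / len(estimates)

# Sensitivity: change of the estimator when every data point is
# shifted additively by a small eps (a finite-difference proxy for
# the infinitesimal perturbation described in the abstract).
eps = 1e-6
shifted = [x + eps for x in data]
sensitivity = (sample_mean(shifted) - sample_mean(data)) / eps

print(variance)     # roughly 1/n for a standard normal sample
print(sensitivity)  # the sample mean shifts one-for-one with the data
```

For the sample mean the sensitivity is exactly 1 regardless of the data, while the variance shrinks like 1/n; the two notions of instability are genuinely different quantities, which is the gap the WCRLB addresses.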