Wasserstein–Cramér–Rao Theory of Unbiased Estimation

📅 2025-11-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Classical Cramér–Rao theory characterizes only the sampling variance (i.e., instability under independent repeated sampling) of estimators, neglecting their sensitivity to infinitesimal additive perturbations of the underlying data distribution. Method: We develop a novel unbiased estimation theory grounded in Wasserstein geometry, introducing the Wasserstein distance—central to optimal transport—into sensitivity analysis. This yields the Wasserstein–Cramér–Rao lower bound (WCRLB), a fundamental inequality quantifying estimator sensitivity to distributional shifts. Contribution/Results: We derive necessary and sufficient conditions for WCRLB attainability and establish that exponential families admit WCRLB-achieving estimators. Moreover, we prove that Wasserstein projection estimators achieve asymptotically optimal sensitivity. The framework unifies statistical manifolds, gradient operators, and information geometry, providing a foundational inequality tool for robust estimation and distributional perturbation analysis.

📝 Abstract
The quantity of interest in the classical Cramér–Rao theory of unbiased estimation (e.g., the Cramér–Rao lower bound, its exact attainment for exponential families, and asymptotic efficiency of maximum likelihood estimation) is the variance, which represents the instability of an estimator when its value is compared to the value for an independently-sampled data set from the same distribution. In this paper we are interested in a quantity which represents the instability of an estimator when its value is compared to the value for an infinitesimal additive perturbation of the original data set; we refer to this as the "sensitivity" of an estimator. The resulting theory of sensitivity is based on the Wasserstein geometry in the same way that the classical theory of variance is based on the Fisher–Rao (equivalently, Hellinger) geometry, and this insight allows us to determine a collection of results which are analogous to the classical case: a Wasserstein–Cramér–Rao lower bound for the sensitivity of any unbiased estimator, a characterization of models in which there exist unbiased estimators achieving the lower bound exactly, and some concrete results that show that the Wasserstein projection estimator achieves the lower bound asymptotically. We use these results to treat many statistical examples, sometimes revealing new optimality properties for existing estimators and other times revealing entirely new estimators.
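The abstract's notion of "sensitivity" compares an estimator's value on a data set with its value after an infinitesimal additive perturbation. A minimal numerical sketch of that idea (not code from the paper; the `sensitivity` helper, the finite-difference step `eps`, and the choice of a uniform shift direction are illustrative assumptions, whereas the paper treats general perturbation directions):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=1000)

def sensitivity(estimator, data, eps=1e-6):
    """Finite-difference response of an estimator to a small uniform
    additive shift of every data point (one illustrative direction)."""
    return (estimator(data + eps) - estimator(data)) / eps

# Both the sample mean and the sample median are equivariant under a
# uniform shift of the data, so each responds one-for-one:
print(sensitivity(np.mean, x))    # ≈ 1.0
print(sensitivity(np.median, x))  # ≈ 1.0
```

This only probes one perturbation direction; the paper's sensitivity is a geometric quantity over all infinitesimal perturbations, bounded below by the Wasserstein–Cramér–Rao inequality.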
Problem

Research questions and friction points this paper is trying to address.

Develops sensitivity theory for estimator instability under data perturbations
Establishes Wasserstein-Cramér-Rao lower bound for unbiased estimators
Identifies optimal estimators achieving sensitivity bounds asymptotically
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein geometry replaces Fisher-Rao geometry
Defines sensitivity via infinitesimal data perturbations
Provides Wasserstein-Cramér-Rao lower bound theory
N. G. Trillos
Department of Statistics, University of Wisconsin–Madison, Madison, WI
Adam Quinn Jaffe
Department of Statistics, Columbia University, New York, NY
Bodhisattva Sen
Columbia University, New York
Nonparametric statistics · Empirical Bayes · Optimal Transport · Astro-statistics