A Geometric Unification of Distributionally Robust Covariance Estimators: Shrinking the Spectrum by Inflating the Ambiguity Set

📅 2024-05-30
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
Existing spectral shrinkage methods for high-dimensional covariance estimation rely either on heuristic objectives or strong distributional assumptions. To address this, we propose a geometrically unified framework grounded in distributionally robust optimization (DRO). Our approach defines geometric divergence measures—including KL, Fisher–Rao, and Wasserstein divergences—on the covariance manifold, constructs data-driven ambiguity sets, and automatically derives spectral shrinkage estimators by minimizing the worst-case Frobenius error. Crucially, we establish for the first time that the functional form of shrinkage is intrinsically determined by the geometric structure of the chosen divergence, thereby eliminating dependence on prior distributional assumptions. We prove asymptotic consistency of the proposed estimator and derive an $O(1/n)$ finite-sample error bound. Moreover, we obtain three explicit robust estimators, which match state-of-the-art performance on both synthetic and real-world datasets.

Technology Category

Application Category

📝 Abstract
The state-of-the-art methods for estimating high-dimensional covariance matrices all shrink the eigenvalues of the sample covariance matrix towards a data-insensitive shrinkage target. The underlying shrinkage transformation is either chosen heuristically - without compelling theoretical justification - or optimally in view of restrictive distributional assumptions. In this paper, we propose a principled approach to construct covariance estimators without imposing restrictive assumptions. That is, we study distributionally robust covariance estimation problems that minimize the worst-case Frobenius error with respect to all data distributions close to a nominal distribution, where the proximity of distributions is measured via a divergence on the space of covariance matrices. We identify mild conditions on this divergence under which the resulting minimizers represent shrinkage estimators. We show that the corresponding shrinkage transformations are intimately related to the geometrical properties of the underlying divergence. We also prove that our robust estimators are efficiently computable and asymptotically consistent and that they enjoy finite-sample performance guarantees. We exemplify our general methodology by synthesizing explicit estimators induced by the Kullback-Leibler, Fisher-Rao, and Wasserstein divergences. Numerical experiments based on synthetic and real data show that our robust estimators are competitive with state-of-the-art estimators.
Problem

Research questions and friction points this paper is trying to address.

Unifying robust covariance estimators via geometric divergence principles
Minimizing Frobenius error under distributional ambiguity constraints
Developing computable shrinkage estimators with performance guarantees
Innovation

Methods, ideas, or system contributions that make the work stand out.

Distributionally robust covariance estimation via divergence
Shrinkage estimators with geometrical divergence properties
Efficiently computable robust estimators with guarantees
🔎 Similar Papers
No similar papers found.