🤖 AI Summary
To address the sensitivity of Fréchet means to outliers on Riemannian manifolds, this paper introduces the Huber mean, a location estimator that extends Huber's M-estimation to Riemannian manifolds. It blends squared and absolute loss functions, achieving a breakdown point of at least 0.5 (the highest attainable among estimators equivariant under isometries) while preserving computational efficiency. Theoretically, the paper establishes existence, uniqueness, statistical consistency, and a central limit theorem, and develops a framework for robust one-sample hypothesis testing and confidence-region construction. Methodologically, the approach integrates Riemannian geometry, M-estimation theory, and robust covariance estimation on manifolds. Numerical experiments on the sphere and on the manifold of symmetric positive-definite (SPD) matrices demonstrate that the Huber mean substantially outperforms the Fréchet mean and the geometric median under heavy-tailed distributions and outlier contamination.
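For reference, the classical Huber loss that the estimator builds on takes the standard form below; the paper's exact parameterization of the threshold \(c\) and of the objective is an assumption here, written with \(d(\cdot,\cdot)\) denoting geodesic distance on the manifold \(M\):

```latex
\[
\rho_c(d) =
\begin{cases}
\tfrac{1}{2}\,d^{2}, & 0 \le d \le c,\\
c\,d - \tfrac{1}{2}\,c^{2}, & d > c,
\end{cases}
\qquad
\hat{m} = \operatorname*{arg\,min}_{m \in M} \sum_{i=1}^{n} \rho_c\bigl(d(x_i, m)\bigr).
\]
```

The loss is quadratic near the center (yielding efficiency close to the Fréchet mean) and linear in the tails (capping the influence of any single observation, which is what drives the robustness claims above).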
📝 Abstract
This article introduces Huber means on Riemannian manifolds, providing a robust alternative to the Fréchet mean by integrating elements of both the squared and absolute loss functions. Huber means are designed to be highly resistant to outliers while maintaining efficiency, making them a valuable generalization of Huber's M-estimator to manifold-valued data. We comprehensively investigate the statistical and computational aspects of Huber means, demonstrating their utility in manifold-valued data analysis. Specifically, we establish minimal conditions ensuring the existence and uniqueness of the Huber mean and discuss regularity conditions for unbiasedness. Huber means are statistically consistent and satisfy a central limit theorem. Additionally, we propose a moment-based estimator for the limiting covariance matrix, which is used to construct a robust one-sample location test and an approximate confidence region for location parameters. Huber means are shown to be highly robust and efficient in the presence of outliers or under heavy-tailed distributions: they achieve a breakdown point of at least 0.5, the highest among all isometry-equivariant estimators, and are more efficient than the Fréchet mean under heavy-tailed distributions. Numerical examples on spheres and on the set of symmetric positive-definite matrices further illustrate the efficiency and reliability of the proposed Huber means on Riemannian manifolds.
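To make the construction concrete, here is a minimal sketch of computing a Huber-type mean on the unit sphere by Riemannian gradient descent with Huber weights. This is an illustrative implementation under standard sphere geometry (exp/log maps), not the paper's algorithm; the threshold `c`, step scheme, and function names are assumptions:

```python
import numpy as np

def log_map(m, x):
    """Log map on the unit sphere: tangent vector at m pointing toward x,
    with length equal to the geodesic distance arccos(<m, x>)."""
    cos_t = np.clip(np.dot(m, x), -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-12:
        return np.zeros_like(m)
    return theta / np.sin(theta) * (x - cos_t * m)

def exp_map(m, v):
    """Exp map on the unit sphere: follow the geodesic from m along tangent v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return m
    return np.cos(nv) * m + np.sin(nv) * v / nv

def huber_mean_sphere(X, c=0.5, iters=100):
    """Gradient descent for the Huber objective sum_i rho_c(d(x_i, m)).
    Points within distance c get full (quadratic-regime) weight; farther
    points are downweighted by c/d, capping their influence."""
    m = X[0] / np.linalg.norm(X[0])
    for _ in range(iters):
        grad = np.zeros_like(m)
        for x in X:
            d = np.arccos(np.clip(np.dot(m, x), -1.0, 1.0))
            w = 1.0 if d <= c else c / d  # Huber weight
            grad += w * log_map(m, x)
        m = exp_map(m, grad / len(X))  # averaged Riemannian gradient step
    return m
```

With three points at the north pole and one outlier on the equator, the estimate stays close to the pole because the outlier's pull is capped at `c`, whereas a Fréchet mean (all weights equal to 1) would be dragged noticeably toward the equator.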