🤖 AI Summary
This paper addresses the general parametric estimation problem without distributional assumptions, aiming to construct estimators robust to both model misspecification and complex data dependence structures. We propose a minimum distance estimation framework based on the Maximum Mean Discrepancy (MMD), establishing— for the first time—its statistical consistency under non-i.i.d. sampling and model misspecification, and providing theoretical convergence guarantees for stochastic gradient descent optimization. Theoretically, the estimator exhibits intrinsic robustness to temporal dependence, outliers, and distributional shifts. Numerical experiments demonstrate its superior performance over classical M-estimators under data contamination and intricate time-series settings. Our core contribution lies in unifying the characterization of the MMD estimator’s generalization error and robustness limits, thereby offering a novel paradigm for reliable inference with nonstandard data.
📝 Abstract
Many works in statistics aim at designing a universal estimation procedure, that is, an estimator that would converge to the best approximation of the (unknown) data generating distribution in a model, without any assumption on this distribution. This question is of major interest, in particular because the universality property leads to the robustness of the estimator. In this paper, we tackle the problem of universal estimation using a minimum distance estimator presented in Briol et al. (2019) based on the Maximum Mean Discrepancy. We show that the estimator is robust to both dependence and to the presence of outliers in the dataset. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations.