🤖 AI Summary
This paper investigates risk bounds for nonparametric distributional regression estimators, focusing on theoretical analysis of the continuous ranked probability score (CRPS) and the in-domain worst-case mean squared error (MSE) under both convex and non-convex constraints. Methodologically, it extends isotonic regression and trend filtering to the distributional regression setting, drawing on convex optimization, monotone regression, nonparametric inference, and neural network modeling. The key contributions are: (i) a unified framework establishing general upper bounds on CRPS and worst-case MSE for both constraint classes; (ii) attainment of minimax-optimal convergence rates, matching those for mean estimation, under mild regularity conditions; and (iii) empirical validation on simulated and real data, supporting the tightness of the derived bounds and showing improvements over baselines in both CRPS and MSE, with convergence behavior consistent with the theoretical predictions.
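For reference, the CRPS of a predictive CDF $F$ at a realization $y$ is the standard integrated quadratic score; the notation below is ours, not taken from the paper:

```latex
\mathrm{CRPS}(F, y) = \int_{-\infty}^{\infty} \bigl( F(t) - \mathbf{1}\{t \ge y\} \bigr)^{2} \, dt
```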
📝 Abstract
This work examines risk bounds for nonparametric distributional regression estimators. For convex-constrained distributional regression, general upper bounds are established for the continuous ranked probability score (CRPS) and the worst-case mean squared error (MSE) across the domain. These theoretical results are applied to isotonic and trend filtering distributional regression, yielding convergence rates consistent with those for mean estimation. Furthermore, a general upper bound is derived for distributional regression under non-convex constraints, with a specific application to neural network-based estimators. Comprehensive experiments on both simulated and real data validate the theoretical contributions, demonstrating their practical effectiveness.
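To make the isotonic case concrete, below is a minimal sketch (not the paper's estimator) of one common route to isotonic distributional regression: fitting a separate isotonic regression of the threshold indicators $\mathbf{1}\{y_i \le t\}$ for each threshold $t$, then scoring with an empirical CRPS. The data-generating process, threshold grid, and use of scikit-learn's `IsotonicRegression` are all illustrative assumptions.

```python
# Sketch: per-threshold isotonic distributional regression + empirical CRPS.
# Assumes Y is stochastically increasing in x, so the conditional CDF
# F(t | x) is decreasing in x at every threshold t.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
n = 500
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sqrt(x) + 0.2 * rng.standard_normal(n)  # conditional mean increasing in x

# Threshold grid covering the observed response range.
grid = np.linspace(y.min() - 0.5, y.max() + 0.5, 200)

# For each threshold t, regress the indicator 1{y_i <= t} on x_i under a
# monotone-decreasing constraint in x.
F_hat = np.empty((n, grid.size))
for j, t in enumerate(grid):
    iso = IsotonicRegression(increasing=False, out_of_bounds="clip")
    F_hat[:, j] = iso.fit_transform(x, (y <= t).astype(float))

# The isotonic (least-squares) projection is order-preserving in its data
# argument, so each row of F_hat is automatically nondecreasing in t,
# i.e. a valid CDF estimate over the grid.

# Empirical CRPS(F, y) = \int (F(t) - 1{t >= y})^2 dt, via a Riemann sum.
dt = grid[1] - grid[0]
crps = ((F_hat - (grid[None, :] >= y[:, None])) ** 2).sum(axis=1) * dt
print("mean CRPS:", crps.mean())
```

The per-threshold decomposition is only one way to impose the monotone (convex-cone) constraint; it is chosen here because it reduces the distributional fit to a family of standard isotonic regressions.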