🤖 AI Summary
This work addresses the limitations of existing asymptotic theory for induced order statistics (IOS), which relies on overly strong smoothness assumptions when covariate dimension grows and fails to accommodate boundary-point settings such as regression discontinuity designs. By introducing a quadratic mean differentiability condition together with Taylor- or Hölder-type remainder bounds, the paper establishes marginal and joint convergence rates for IOS under both Hellinger and total variation distances under substantially weaker assumptions. The proposed framework uniformly handles interior and boundary cases, elucidates the trade-off between smoothness and convergence speed, specifies precise growth conditions on the number of nearest neighbors, and uncovers distinct behaviors of the two distance metrics under different mechanisms. The derived rates are directly applicable to regression discontinuity, k-nearest neighbor methods, and distributionally robust optimization, significantly broadening the scope of IOS theory.
📄 Abstract
Induced order statistics (IOS) arise when sample units are reordered according to the value of an auxiliary variable, and the associated responses are analyzed in that induced order. IOS play a central role in applications where the goal is to approximate the conditional distribution of an outcome at a fixed covariate value using observations whose covariates lie closest to that point, including regression discontinuity designs, k-nearest-neighbor methods, and distributionally robust optimization. Existing asymptotic results allow the dimension of the IOS vector to grow with the sample size only under smoothness conditions that are often too restrictive for practical data-generating processes. In particular, these conditions rule out boundary points, which are central to regression discontinuity designs. This paper develops general convergence rates for IOS under primitive and comparatively weak assumptions. We derive sharp marginal rates for the approximation of the target conditional distribution in Hellinger and total variation distances under quadratic mean differentiability and show how these marginal rates translate into joint convergence rates for the IOS vector. Our results are widely applicable: they rely on a standard smoothness condition and accommodate both interior and boundary conditioning points, as required in regression discontinuity and related settings. In the supplementary appendix, we provide complementary results under a Taylor/Hölder remainder condition. Our results reveal a clear trade-off between smoothness and speed of convergence, identify regimes in which Hellinger and total variation distances behave differently, and provide explicit growth conditions on the number of nearest neighbors.
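To make the construction concrete, the following minimal sketch shows how IOS arise in the nearest-neighbor setting described above: units are ordered by the distance of their covariate to a fixed conditioning point, and the responses taken in that order form the induced order statistics. All specifics here (the data-generating process, the conditioning point `x0`, and the neighbor count `k`) are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, size=n)                 # auxiliary covariates (hypothetical DGP)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(n)  # associated responses

x0 = 0.5  # fixed conditioning point (could also sit at a boundary, e.g. x0 = 1.0)
k = 10    # number of nearest neighbors retained

# Reorder units by |x_i - x0|; the responses read off in this induced
# order are the induced order statistics. The first k of them are the
# responses of the k nearest neighbors of x0, whose empirical law
# approximates the conditional distribution of y given x = x0.
order = np.argsort(np.abs(x - x0))
ios_k = y[order[:k]]
```

The theory in the paper concerns how fast the joint law of such a vector `ios_k` approaches the k-fold product of the target conditional distribution as `n` grows, and how fast `k` may grow with `n`.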