Robust Inference for Convex Pairwise Difference Estimators

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the instability of statistical inference in convex pairwise difference estimation caused by bandwidth sensitivity, relaxing the stringent bandwidth conditions traditionally required for asymptotic normality. Methodologically, it introduces a small-bandwidth asymptotic framework, constructs a kernel-weighted convex objective that matches observation pairs with similar covariates, proposes a generalized jackknife for bias correction, and develops a variance-tunable adaptive bootstrap for valid inference across a broad range of bandwidths. By integrating convex optimization, kernel-weighted estimation, and resampling techniques, the approach achieves Gaussian approximations and constructs uniformly valid confidence intervals under weak regularity assumptions. Empirically, the proposed method substantially improves the robustness of inference while preserving theoretical rigor and practical feasibility.
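As a concrete (and heavily simplified) illustration of the kernel-weighted pairwise objective described above, the sketch below matches observation pairs on a covariate `w` with a Gaussian kernel of bandwidth `h` and minimizes a convex loss over pairwise differences. Every name here (`pairwise_difference_fit`, the squared-error default loss, the Gaussian kernel) is an illustrative assumption, not the paper's exact estimator:

```python
import numpy as np
from scipy.optimize import minimize

def pairwise_difference_fit(y, x, w, h, rho=lambda u: u ** 2):
    """Kernel-weighted convex pairwise difference fit (illustrative sketch).

    Each pair (i, j) gets weight K((w_i - w_j) / h); the convex loss rho
    is applied to the pairwise model residual. The paper's estimator
    class is far more general; this is only a minimal example."""
    i, j = np.triu_indices(len(y), k=1)           # all pairs i < j
    kw = np.exp(-0.5 * ((w[i] - w[j]) / h) ** 2)  # Gaussian kernel weights
    dy, dx = y[i] - y[j], x[i] - x[j]             # pairwise differences

    def objective(beta):
        return np.sum(kw * rho(dy - dx @ beta))

    return minimize(objective, np.zeros(x.shape[1]), method="BFGS").x
```

With the squared-error default this reduces to kernel-weighted least squares on differences; a non-smooth convex loss (e.g. absolute error) would typically call for a solver that handles non-differentiability.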

📝 Abstract
This paper develops distribution theory and bootstrap-based inference methods for a broad class of convex pairwise difference estimators. These estimators minimize a kernel-weighted convex-in-parameter function over observation pairs that are similar in terms of certain covariates, where the similarity is governed by a localization (bandwidth) parameter. While classical results establish asymptotic normality under restrictive bandwidth conditions, we show that valid Gaussian and bootstrap-based inference remains possible under substantially weaker assumptions. First, we extend the theory of small bandwidth asymptotics to convex pairwise estimation settings, deriving robust Gaussian approximations even when a smaller-than-standard bandwidth is used. Second, we employ a debiasing procedure based on generalized jackknifing to enable inference with larger bandwidths, while preserving convexity of the objective function. Third, we construct a novel bootstrap method that adjusts for bandwidth-induced variance distortions, yielding valid inference across a wide range of bandwidth choices. Our proposed inference methods are demonstrably more robust, while retaining the practical appeal of convex pairwise difference estimators.
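The generalized-jackknife debiasing step admits a simple sketch: if an estimator's leading smoothing bias scales like `h**p`, a linear combination of estimates at two bandwidths cancels that term exactly. The exponent `p` and bandwidth ratio `c` below are illustrative placeholders, not the paper's prescribed choices:

```python
def generalized_jackknife(estimate, h, c=0.7, p=2):
    """Generalized jackknife bias correction (illustrative sketch).

    Assumes theta(h) = theta0 + B * h**p + smaller terms, so that

        (theta(c*h) - c**p * theta(h)) / (1 - c**p)
            = theta0 + O(smaller terms),

    i.e. the leading h**p bias cancels. `estimate` is any function
    mapping a bandwidth to a point estimate."""
    th, tch = estimate(h), estimate(c * h)
    return (tch - c ** p * th) / (1 - c ** p)
```

A quick sanity check: for a toy "estimator" with pure quadratic bias, `estimate = lambda h: 1.0 + 2.0 * h ** 2`, the combination recovers the bias-free value 1.0 exactly.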
Problem

Research questions and friction points this paper is trying to address.

Inference for convex pairwise difference estimators is sensitive to the choice of bandwidth
Classical asymptotic normality results hold only under restrictive bandwidth conditions
Bandwidth-induced variance distortions can invalidate standard Gaussian and bootstrap approximations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends small bandwidth asymptotics for convex pairwise estimation
Employs debiasing via generalized jackknifing for larger bandwidths
Constructs bootstrap method adjusting for bandwidth-induced variance distortions
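As a baseline point of comparison for the last innovation, a plain nonparametric bootstrap looks like the sketch below; the paper's adaptive bootstrap additionally corrects for bandwidth-induced variance distortions, which this plain version makes no attempt to do:

```python
import numpy as np

def bootstrap_ci(estimator, data, n_boot=500, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval (plain baseline sketch).

    Resamples observations with replacement and takes empirical
    quantiles of the resampled statistics; no bandwidth-dependent
    variance adjustment is performed here."""
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array(
        [estimator(data[rng.integers(0, n, size=n)]) for _ in range(n_boot)]
    )
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

For pairwise estimators, resampling at the observation level (as here) rather than the pair level is what keeps the dependence between pairs sharing an observation intact.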