Debiased Inference for High-Dimensional Regression Models Based on Profile M-Estimation

📅 2025-12-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
In high-dimensional regression, regularization-induced bias undermines valid statistical inference; existing debiasing methods rely on explicit construction of nuisance parameter projections to satisfy Neyman orthogonality, a requirement often infeasible for complex models. This paper proposes DPME, a general-purpose debiasing framework that combines profile M-estimation with a one-step Newton correction applied to the numerically differentiated gradient of the profile objective, eliminating the need for analytical gradients, explicit projections, or model-specific orthogonalization. The authors establish root-n consistency and asymptotic normality of the DPME estimator under mild regularity conditions. Simulation studies demonstrate substantially improved confidence interval coverage and markedly reduced computational cost compared to existing approaches. Empirical validation on estimating optimal treatment rules for multiple myeloma further confirms its practical efficacy.

📝 Abstract
Debiased inference for high-dimensional regression models has received substantial recent attention as a way to ensure that regularized estimators support valid inference. All existing methods focus on achieving Neyman orthogonality through explicitly constructing projections onto the space of nuisance parameters, which is infeasible when an explicit form of the projection is unavailable. We introduce a general debiasing framework, Debiased Profile M-Estimation (DPME), which applies to a broad class of models and does not require the model-specific Neyman orthogonalization or projection derivations of existing methods. Our approach begins by obtaining an initial estimator of the parameters by optimizing a penalized objective function. To correct for the bias introduced by penalization, we construct a one-step estimator using the Newton-Raphson update, applied to the gradient of a profile function defined as the optimal objective function with the parameter of interest held fixed. We use numerical differentiation, avoiding explicit calculation of the gradients. The resulting DPME estimator is shown to be asymptotically linear and normally distributed. Through extensive simulations, we demonstrate that the proposed method achieves better coverage rates than existing alternatives, with substantially reduced computational cost. Finally, we illustrate the utility of our method with an application to estimating a treatment rule for multiple myeloma.
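The one-step correction described in the abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: it uses a ridge penalty (so the nuisance coefficients can be profiled out in closed form) in place of a general penalized M-estimator, and all names (`profile_objective`, `one_step_debias`) and tuning values are hypothetical. The key ingredients match the abstract: a profile function obtained by optimizing over nuisance parameters with the parameter of interest held fixed, and a Newton-Raphson update whose gradient and Hessian come from numerical (finite-difference) differentiation rather than analytical derivatives.

```python
import numpy as np

def profile_objective(theta, X, y, lam):
    """Profile function: fix the parameter of interest theta (coefficient of
    X[:, 0]) and minimize the ridge-penalized least-squares objective over the
    nuisance coefficients eta, which has a closed-form solution."""
    r = y - X[:, 0] * theta              # residual after fixing theta
    Z = X[:, 1:]                         # nuisance design matrix
    q = Z.shape[1]
    eta = np.linalg.solve(Z.T @ Z + lam * np.eye(q), Z.T @ r)
    resid = r - Z @ eta
    return 0.5 * np.mean(resid**2) + 0.5 * lam * np.sum(eta**2) / len(y)

def one_step_debias(theta_init, X, y, lam, h=1e-4):
    """One Newton-Raphson step on the profile function, with gradient and
    Hessian obtained by central finite differences (no analytical gradients)."""
    f = lambda t: profile_objective(t, X, y, lam)
    grad = (f(theta_init + h) - f(theta_init - h)) / (2 * h)
    hess = (f(theta_init + h) - 2 * f(theta_init) + f(theta_init - h)) / h**2
    return theta_init - grad / hess

# Toy demonstration on simulated data (all values illustrative).
np.random.seed(0)
n, p = 200, 5
X = np.random.randn(n, p)
eta_true = np.array([0.5, -0.3, 0.2, 0.1])
y = 2.0 * X[:, 0] + X[:, 1:] @ eta_true + 0.5 * np.random.randn(n)

lam = 50.0
# Initial estimator: ridge penalizes ALL coefficients, biasing theta toward 0.
beta_init = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
theta_init = beta_init[0]
# One-step correction on the profile function removes most of that bias.
theta_debiased = one_step_debias(theta_init, X, y, lam)
```

Because the ridge profile function is quadratic in `theta`, a single Newton step lands at the profile minimizer exactly; for general smooth M-estimators the one-step update yields the asymptotically linear correction the paper studies.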
Problem

Research questions and friction points this paper is trying to address.

Debiasing high-dimensional regression models for valid inference
Overcoming infeasibility of explicit projection in nuisance parameter space
Providing a general framework without model-specific orthogonalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses profile function with Newton-Raphson update for bias correction
Employs numerical differentiation without explicit gradient calculation
Applies to a broad class of models without model-specific orthogonalization derivations