Triple/Double-Debiased Lasso

📅 2026-03-20
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
In high-dimensional linear regression, regularization induces first- and second-order biases that can severely degrade finite-sample inference accuracy and confidence-interval coverage for low-dimensional parameters. This work proposes a triple (or double-debiased) Lasso estimator built on moment functions that satisfy both first- and second-order Neyman orthogonality conditions. This eliminates the dominant first- and second-order biases and substantially reduces the order of the remainder term in the asymptotic linear representation. The paper also provides a recursive formula for constructing higher-order orthogonal moment functions. Theoretical analysis and Monte Carlo simulations show that, relative to the standard double Lasso, the proposed method significantly improves inference accuracy and confidence-interval coverage.
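To make the orthogonality conditions concrete (notation is illustrative, not taken from the paper): for a moment function ψ(W; θ, η) with true nuisance η₀, first-order Neyman orthogonality requires the Gateaux derivative in η to vanish at the truth, and second-order orthogonality additionally requires the second derivative to vanish:

```latex
\partial_\eta \,\mathbb{E}\!\left[\psi(W;\theta_0,\eta)\right]\Big|_{\eta=\eta_0} = 0,
\qquad
\partial_\eta^2 \,\mathbb{E}\!\left[\psi(W;\theta_0,\eta)\right]\Big|_{\eta=\eta_0} = 0.
```

The first condition makes the estimator insensitive to first-order errors in the estimated nuisance (e.g., Lasso regularization bias), leaving a remainder of order ‖η̂ − η₀‖²; adding the second condition pushes the remainder to third order, which is the source of the improved finite-sample accuracy described above.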

πŸ“ Abstract
In this paper, we propose a triple (or double-debiased) Lasso estimator for inference on a low-dimensional parameter in high-dimensional linear regression models. The estimator is based on a moment function that satisfies not only first- but also second-order Neyman orthogonality conditions, thereby eliminating both the leading bias and the second-order bias induced by regularization. We derive an asymptotic linear representation for the proposed estimator and show that its remainder terms are never larger and are often smaller in order than those in the corresponding asymptotic linear representation for the standard double Lasso estimator. Because of this improvement, the triple Lasso estimator often yields more accurate finite-sample inference and confidence intervals with better coverage. Monte Carlo simulations confirm these gains. In addition, we provide a general recursive formula for constructing higher-order Neyman orthogonal moment functions in Z-estimation problems, which underlies the proposed estimator as a special case.
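The baseline that the paper improves on, the standard double (debiased) Lasso via partialling-out, can be sketched as follows. This is a minimal illustration on simulated data, not the paper's triple Lasso (its second-order correction is not reproduced here); the design, penalty level, and variable names are assumptions for the example.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))        # high-dimensional controls
d = X[:, 0] + rng.standard_normal(n)   # treatment, correlated with the controls
theta = 1.0                            # low-dimensional parameter of interest
y = theta * d + 0.5 * X[:, 1] + rng.standard_normal(n)

# Step 1: Lasso of y on X (nuisance regression for the outcome), keep residuals.
ry = y - Lasso(alpha=0.1).fit(X, y).predict(X)
# Step 2: Lasso of d on X (nuisance regression for the treatment), keep residuals.
rd = d - Lasso(alpha=0.1).fit(X, d).predict(X)
# Step 3: OLS of the outcome residuals on the treatment residuals.
theta_hat = (rd @ ry) / (rd @ rd)
print(theta_hat)  # close to the true theta = 1.0
```

The residual-on-residual regression in Step 3 is the first-order Neyman-orthogonal moment; the paper's contribution is a further correction that also removes the second-order bias left by the Lasso shrinkage in Steps 1 and 2.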
Problem

Research questions and friction points this paper is trying to address.

high-dimensional inference
Neyman orthogonality
regularization bias
Lasso
finite-sample accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Triple-debiased Lasso
Neyman orthogonality
High-dimensional inference
Bias reduction
Z-estimation
🔎 Similar Papers
No similar papers found.