Knoop: practical enhancement of knockoff with over-parameterization for variable selection

📅 2025-01-01
🏛️ Machine-mediated learning
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of model-agnostic variable selection in high-dimensional settings, this paper proposes Knoop, a knockoff-based method that introduces over-parameterization into the knockoff framework for the first time, easing knockoff construction and mitigating the loss of power in small-sample regimes. Methodologically, Knoop leverages deep generative models to learn adaptable knockoff features, incorporates an adaptive thresholding strategy, provides finite-sample false discovery rate (FDR) control guarantees, and improves stability under randomization. Extensive experiments across multivariate linear regression, generalized linear models, and nonlinear settings show that, while strictly controlling the FDR at level α = 0.1, Knoop achieves average power gains of 23%–41% over existing knockoff methods with manageable computational overhead. These advances make knockoffs substantially more practical and reliable for real-world high-dimensional inference.
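For context on the selection rule the summary refers to, here is a minimal sketch of the generic knockoff+ data-dependent threshold (Barber and Candès, 2015) that knockoff methods use to control the FDR at a target level q. This is not Knoop's specific adaptive-thresholding procedure; the importance statistics `W` below are simulated purely for illustration (positive `W[j]` means the original feature beats its knockoff).

```python
import numpy as np

def knockoff_plus_threshold(W, q=0.1):
    """Generic knockoff+ threshold: smallest t such that the estimated
    false discovery proportion (1 + #{W_j <= -t}) / #{W_j >= t} is <= q.
    Returns np.inf if no such t exists (then nothing is selected)."""
    for t in np.sort(np.abs(W[W != 0])):  # candidate thresholds
        fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:
            return t
    return np.inf

# Toy example (simulated, not from the paper): 10 strong signals, 40 noise stats.
rng = np.random.default_rng(0)
W = np.concatenate([np.full(10, 4.0), rng.normal(0.0, 0.5, 40)])
T = knockoff_plus_threshold(W, q=0.1)
selected = np.where(W >= T)[0]  # indices of selected variables
```

The negative statistics act as an internal null calibration: counting how many knockoffs beat their originals at level t estimates how many false positives survive the same threshold, which is what gives the finite-sample FDR guarantee.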

Problem

Research questions and friction points this paper is trying to address.

- Feature Selection
- Model Accuracy
- Efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

- Variable Selection
- Anomaly Detection
- Regression Model