🤖 AI Summary
To address the challenge of model-agnostic variable selection in high-dimensional settings, this paper proposes Knoop—a novel knockoff-based method that, for the first time, introduces overparameterization into the knockoff framework, easing the difficulty of constructing knockoffs and mitigating the loss of power in small-sample regimes. Methodologically, Knoop leverages deep generative models to learn adaptive knockoff features, incorporates an adaptive thresholding strategy, provides theoretical finite-sample control of the false discovery rate (FDR), and improves stability under randomization. Extensive experiments spanning multivariate linear regression, generalized linear models, and nonlinear settings show that, while strictly controlling the FDR at the target level α = 0.1, Knoop achieves average power gains of 23%–41% over existing knockoff methods with manageable computational overhead. These advances substantially improve the practicality and reliability of knockoffs in real-world high-dimensional inference tasks.
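For context, the FDR guarantee cited above rests on the standard knockoff selection step: each variable gets a statistic comparing the original feature to its knockoff copy, and a data-dependent threshold caps the estimated false discovery proportion at the target level. The sketch below shows only this generic knockoff+ thresholding rule, not Knoop's specific construction; the function name, the statistics `W`, and the toy numbers are illustrative assumptions.

```python
import numpy as np

def knockoff_plus_select(W, q=0.1):
    """Generic knockoff+ selection rule (a sketch, not Knoop itself):
    take the smallest threshold t whose estimated false discovery proportion
    (1 + #{W_j <= -t}) / #{W_j >= t} is at most q, then report all j with W_j >= t."""
    for t in np.sort(np.abs(W[W != 0])):           # candidate thresholds from |W|
        fdp_hat = (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1)
        if fdp_hat <= q:
            return np.flatnonzero(W >= t)          # indices of selected variables
    return np.array([], dtype=int)                 # nothing clears the threshold

# Toy statistics: 12 clearly important variables plus symmetric noise for nulls.
# In a knockoff pipeline, W_j would compare the importance of feature j against
# its knockoff copy (e.g., a difference of absolute fitted coefficients).
W = np.concatenate([np.full(12, 5.0), np.linspace(-0.9, 0.9, 30)])
print(knockoff_plus_select(W, q=0.1))              # -> indices 0..11
```

Whatever importance measure produces `W`, this thresholding step is what yields finite-sample FDR control; methods like Knoop differ in how the knockoff features and statistics are generated upstream.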