Bayesian Controlled FDR Variable Selection via Knockoffs

📅 2024-11-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the challenge of simultaneously controlling the false discovery rate (FDR), ensuring selection stability, and maintaining high statistical power in high-dimensional variable selection. We propose a fully Bayesian “X-knockoff” method that jointly models the response and covariates. Crucially, it is the first to embed a graph-structured prior—such as a Gaussian graphical model—into the knockoff generation process. Variable selection is performed via a modified spike-and-slab prior coupled with an upper bound on the posterior non-inclusion probability. We establish theoretical guarantees: the method strictly controls the Bayesian FDR (BFDR) in finite samples and exhibits robustness to errors in precision matrix estimation. Extensive simulations and real-data analyses demonstrate that, compared to classical knockoff and state-of-the-art Bayesian approaches, our method achieves significantly improved selection stability and more accurate FDR control, while retaining superior statistical power.

Technology Category

Application Category

📝 Abstract
In many research fields, researchers aim to identify significant associations between a set of explanatory variables and a response while controlling the false discovery rate (FDR). To this aim, we develop a fully Bayesian generalization of the classical model-X knockoff filter. Knockoff filter introduces controlled noise in the model in the form of cleverly constructed copies of the predictors as auxiliary variables. In our approach we consider the joint model of the covariates and the response and incorporate the conditional independence structure of the covariates into the prior distribution of the auxiliary knockoff variables. We further incorporate the estimation of a graphical model among the covariates, which in turn aids knockoffs generation and improves the estimation of the covariate effects on the response. We use a modified spike-and-slab prior on the regression coefficients, which avoids the increase of the model dimension as typical in the classical knockoff filter. Our model performs variable selection using an upper bound on the posterior probability of non-inclusion. We show how our model construction leads to valid model-X knockoffs and demonstrate that the proposed characterization is sufficient for controlling the BFDR at an arbitrary level, in finite samples. We also show that the model selection is robust to the estimation of the precision matrix. We use simulated data to demonstrate that our proposal increases the stability of the selection with respect to classical knockoff methods, as it relies on the entire posterior distribution of the knockoff variables instead of a single sample. With respect to Bayesian variable selection methods, we show that our selection procedure achieves comparable or better performances, while maintaining control over the FDR. Finally, we show the usefulness of the proposed model with an application to real data.
Problem

Research questions and friction points this paper is trying to address.

Develops Bayesian knockoff filter for FDR-controlled variable selection
Improves knockoff generation via covariate graphical model estimation
Controls Bayesian FDR in finite and asymptotic scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian generalization of model-X knockoff filter
Joint modeling of covariates and response variables
Modified spike-and-slab prior for variable selection
🔎 Similar Papers
No similar papers found.
L
Lorenzo Focardi-Olmi
Department of Statistics, Computer Science, Application "G.Parenti", University of Florence, Italy
A
A. Gottard
Department of Statistics, Computer Science, Application "G.Parenti", University of Florence, Italy
Michele Guindani
Michele Guindani
Department of Biostatistics, University of California, Los Angeles
Bayesian AnalysisBayesian NonparametricsNeuroimagingImaging GeneticsStatistical decision making
Marina Vannucci
Marina Vannucci
Noah Harding Professor of Statistics, Rice University
Bayesian StatisticsGraphical ModelsStatistical ComputingVariable SelectionWavelets