Differentially Private Model-X Knockoffs via Johnson-Lindenstrauss Transform

📅 2025-08-06
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Addressing the challenge of simultaneously achieving differential privacy and false discovery rate (FDR) control in high-dimensional variable selection, this paper proposes the first statistical framework that achieves asymptotic FDR control under $(\varepsilon,\delta)$-differential privacy. Methodologically, it applies Gaussian Johnson–Lindenstrauss random projections to privatize the Model-X knockoff matrix—preserving approximate isometry of the covariate structure while maintaining exchangeability, unlike conventional noise-addition mechanisms that break it. A novel debiased estimator is further developed to characterize the theoretical trade-off between privacy budget and statistical power. Theoretically, the method guarantees asymptotic FDR control and attains statistical power approaching one in the high-dimensional regime. Empirically, it substantially outperforms baseline noisy mechanisms under stringent privacy budgets.

📝 Abstract
We introduce a novel privatization framework for high-dimensional controlled variable selection. Our framework enables rigorous False Discovery Rate (FDR) control under differential privacy constraints. While the Model-X knockoff procedure provides FDR guarantees by constructing provably exchangeable "negative control" features, existing privacy mechanisms such as Laplace or Gaussian noise injection disrupt its core exchangeability conditions. Our key innovation lies in privatizing the data knockoff matrix through the Gaussian Johnson-Lindenstrauss Transform (JLT), a dimension reduction technique that simultaneously preserves covariate relationships through approximate isometry and ensures $(\varepsilon,\delta)$-differential privacy. We theoretically characterize both the FDR and the power of the proposed private variable selection procedure in an asymptotic regime. Our theoretical analysis characterizes the role of different factors, such as the JLT's dimension reduction ratio, signal-to-noise ratio, differential privacy parameters, sample size, and feature dimension, in shaping the privacy-power trade-off. Our analysis is based on a novel "debiasing technique" for the high-dimensional private knockoff procedure. We further establish sufficient conditions under which the power of the proposed procedure converges to one. This work bridges two critical paradigms -- knockoff-based FDR control and private data release -- enabling reliable variable selection in sensitive domains. Our analysis demonstrates that structural privacy preservation through random projections outperforms the classical noise addition mechanism, maintaining statistical power even under strict privacy budgets.
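The core mechanism the abstract describes can be sketched in a few lines: left-multiply the (augmented) data matrix by a Gaussian random matrix so that column inner products, i.e. the covariate Gram structure, are approximately preserved. The code below is a minimal, illustrative sketch only; the function name, dimensions, and seeds are assumptions, and the paper's actual calibration of the projection dimension to the $(\varepsilon,\delta)$ privacy budget, as well as the knockoff construction itself, are omitted.

```python
import numpy as np

def gaussian_jlt(A, k, seed=None):
    """Gaussian Johnson-Lindenstrauss transform: map the n-row
    matrix A to the k-row matrix S @ A, where S has i.i.d.
    N(0, 1/k) entries, so that E[(SA)^T (SA)] = A^T A.
    Column inner products (the Gram structure) are preserved
    up to small multiplicative error with high probability."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    S = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, n))
    return S @ A

# Toy illustration on a stand-in for the augmented matrix
# [X, X_knockoff] (the knockoff construction is not shown).
rng = np.random.default_rng(0)
n, p, k = 2000, 5, 500
A = rng.normal(size=(n, 2 * p))
G = A.T @ A                  # original Gram matrix
A_priv = gaussian_jlt(A, k, seed=1)
G_priv = A_priv.T @ A_priv   # Gram matrix after projection
rel_err = np.max(np.abs(G_priv - G)) / np.max(np.abs(G))
```

Here `rel_err` stays small even though the projected matrix has only a quarter of the rows, which is the "approximate isometry" property the paper exploits; the privacy guarantee itself comes from the randomness of `S`, not from added noise.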
Problem

Research questions and friction points this paper is trying to address.

Privatizing high-dimensional variable selection under differential privacy
Maintaining false discovery rate control with privacy constraints
Preserving statistical power while ensuring data privacy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Johnson-Lindenstrauss Transform for privacy
Random projection preserving knockoff exchangeability, unlike noise injection
Debiasing technique for high-dimensional knockoffs
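For context on what the framework privatizes, the non-private knockoff filter it builds on selects variables by thresholding feature statistics $W_j$ (large positive values favor the real feature over its knockoff). Below is a sketch of the standard Barber-Candès knockoff+ threshold, not this paper's private variant; the `W` values are made up for illustration.

```python
import numpy as np

def knockoff_threshold(W, q=0.1, offset=1):
    """Smallest t > 0 such that
        (offset + #{j: W_j <= -t}) / max(#{j: W_j >= t}, 1) <= q.
    offset=1 gives knockoff+ (finite-sample FDR control in the
    non-private setting). Returns np.inf if no t qualifies."""
    for t in np.sort(np.abs(W[W != 0])):
        fdp_hat = (offset + np.sum(W <= -t)) / max(np.sum(W >= t), 1)
        if fdp_hat <= q:
            return t
    return np.inf

# Illustrative statistics: mostly positive W_j suggest signals.
W = np.array([3.0, 2.8, 2.5, 2.2, 1.9, -0.3, 0.2, -0.7, 1.6])
t = knockoff_threshold(W, q=0.2)
selected = np.where(W >= t)[0]
```

The paper's contribution is to make such a procedure valid after the data have been privatized by the JLT, using the debiasing technique to correct the statistics computed from the projected matrix.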