Differential Privacy for Regulatory Compliance in Cyberattack Detection on Critical Infrastructure Systems

📅 2025-08-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the tension between regulatory compliance and data privacy in multi-stakeholder critical infrastructure settings, this paper proposes a differential privacy (DP)-enabled hypothesis testing framework for network intrusion detection. Methodologically, it introduces a two-stage privacy-preserving mechanism: joint perturbation of the covariance matrix and sensor-driven test statistics, ensuring statistical power while adhering to strict privacy budgets (ε ≤ 1.0). Theoretically, the framework’s misclassification error is proven to asymptotically converge to that of the non-private baseline. Empirical evaluation on real-world industrial control system data demonstrates robust detection accuracy (>92%) across diverse attack scenarios, coupled with strong DP guarantees. The core contribution lies in the first deep integration of DP into the statistical inference pipeline of industrial control system anomaly detection—thereby simultaneously satisfying regulatory auditability and multi-party data sovereignty requirements.
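The two-stage mechanism described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the noise scales, sensitivity bound, Mahalanobis-style test statistic, and alarm threshold are all assumptions chosen for demonstration, and the data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor readings from one stakeholder (n samples, d sensors).
n, d = 500, 4
X = rng.normal(size=(n, d))

# Phase 1: perturb the empirical covariance matrix with Gaussian noise
# (the noise scale sigma_cov is illustrative, not taken from the paper).
cov = np.cov(X, rowvar=False)
sigma_cov = 0.5
noisy_cov = cov + rng.normal(scale=sigma_cov, size=(d, d))
noisy_cov = (noisy_cov + noisy_cov.T) / 2  # re-symmetrize after perturbation

# Phase 2: perturb the sensor-driven test statistic (a Mahalanobis-style
# score here) with Laplace noise before it is reported to the regulator.
x_new = rng.normal(size=d)
mu = X.mean(axis=0)
stat = (x_new - mu) @ np.linalg.inv(noisy_cov) @ (x_new - mu)
epsilon, sensitivity = 1.0, 1.0  # assumed privacy budget and sensitivity bound
noisy_stat = stat + rng.laplace(scale=sensitivity / epsilon)

# An alarm fires when the private statistic exceeds a detection threshold.
alarm = noisy_stat > 15.0
print(alarm)
```

Releasing only the perturbed covariance and the perturbed statistic is what lets each stakeholder satisfy the regulator's audit without exposing raw sensor or control data.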

📝 Abstract
Industrial control systems are a fundamental component of critical infrastructure networks (CIN) such as gas, water, and power. With the growing risk of cyberattacks, regulatory compliance requirements are also increasing for large-scale critical infrastructure systems comprising multiple utility stakeholders. The primary goal of regulators is to ensure overall system stability with recourse to trustworthy stakeholder attack detection. However, adhering to compliance requirements obliges stakeholders to disclose sensor and control data to regulators, raising privacy concerns. In this paper, we present a cyberattack detection framework that utilizes differentially private (DP) hypothesis tests geared towards enhancing regulatory confidence while alleviating the privacy concerns of CIN stakeholders. The hallmark of our approach is a two-phase privacy scheme that protects the privacy of the covariance as well as the associated sensor-driven test statistics computed as a means to generate alarms. Theoretically, we show that our method induces a misclassification error rate comparable to the non-DP case while delivering robust privacy guarantees. With the help of real-world datasets, we show the reliability of our DP detection outcomes for a wide variety of attack scenarios involving interdependent stakeholders.
Problem

Research questions and friction points this paper is trying to address.

Ensuring privacy in cyberattack detection for critical infrastructure
Balancing regulatory compliance with stakeholder data privacy
Reducing misclassification error in differentially private detection methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentially private hypothesis tests for detection
Two-phase privacy scheme for covariance protection
Comparable error rate with robust privacy guarantees
Paritosh Ramanan
Oklahoma State University
Decentralized Optimization · Distributed Computing · Blockchain · Federated Learning · Differential Privacy
H. M. Mohaimanul Islam
School of Industrial Engineering and Management, Oklahoma State University
Abhiram Reddy Alugula
School of Industrial Engineering and Management, Oklahoma State University