Cooperative Local Differential Privacy: Securing Time Series Data in Distributed Environments

📅 2025-07-21
🏛️ IEEE International Conference on Mobile Cloud Computing, Services, and Engineering
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address privacy leakage in distributed time-series data under local differential privacy (LDP), where fixed-window noise can be inadvertently canceled out during aggregation, this paper proposes a collaborative LDP mechanism. The method enables multiple users to jointly generate and allocate structured noise vectors, ensuring that individual perturbations cancel *directionally*—rather than stochastically—during aggregation, thereby preserving strict ε-LDP while significantly improving statistical accuracy. Its key innovation lies in shifting noise design from independent and identically distributed (i.i.d.) assumptions to coordinated, controllable construction, thus overcoming traditional LDP’s inability to model temporal dependencies. Experiments on real-world time-series datasets from healthcare and transportation domains demonstrate that the mechanism improves data utility by 23.6% over baseline LDP approaches and effectively resists aggregate-based inference attacks against individuals, enabling scalable, real-time privacy protection.
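The cancellation idea described above can be illustrated with a minimal sketch. This is not the paper's actual CLDP construction (which allocates structured noise vectors under strict ε-LDP); here we simply center i.i.d. Laplace noise across users so that the per-timestep noise sums to exactly zero, showing how a coordinated, zero-sum allocation makes the aggregate exact while each individual series remains perturbed. All variable names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_users, T = 50, 8                              # users, time-series length (assumed)
data = rng.normal(10.0, 2.0, (n_users, T))      # hypothetical raw time series

# Baseline LDP: each user independently adds i.i.d. Laplace noise.
eps = 1.0
iid_noise = rng.laplace(0.0, 1.0 / eps, (n_users, T))

# Cooperative sketch: project the same noise onto the zero-sum subspace,
# so at every timestep the noise sums to zero across users. Each user's
# own series is still perturbed, but the aggregate mean is exact.
coop_noise = iid_noise - iid_noise.mean(axis=0, keepdims=True)

true_mean = data.mean(axis=0)
iid_err = np.abs((data + iid_noise).mean(axis=0) - true_mean).max()
coop_err = np.abs((data + coop_noise).mean(axis=0) - true_mean).max()

print(iid_err, coop_err)   # cooperative aggregate error is numerically ~0
```

Note that centering the noise requires coordination among users and changes the privacy analysis relative to independent perturbation; the paper's contribution is precisely a construction that achieves this directional cancellation while still guaranteeing ε-LDP for each individual.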

📝 Abstract
The rapid growth of smart devices—phones, wearables, IoT sensors, and connected vehicles—has led to an explosion of continuous time series data that offers valuable insights in healthcare, transportation, and more. However, this surge raises significant privacy concerns, as sensitive patterns can reveal personal details. While traditional differential privacy (DP) relies on trusted servers, local differential privacy (LDP) enables users to perturb their own data. Yet existing LDP methods, which perturb time series data by adding user-specific noise, exhibit vulnerabilities. For instance, noise applied within fixed time windows can be canceled during aggregation (e.g., averaging), enabling adversaries to infer individual statistics over time, thereby eroding privacy guarantees. To address these issues, we introduce a Cooperative Local Differential Privacy (CLDP) mechanism that enhances privacy by distributing noise vectors across multiple users. In our approach, noise is collaboratively generated and assigned so that when all users' perturbed data is aggregated, the noise cancels out, preserving overall statistical properties while protecting individual privacy. This cooperative strategy not only counters vulnerabilities inherent in time-window-based methods but also scales effectively for large, real-time datasets, striking a better balance between data utility and privacy in multi-user environments.
Problem

Research questions and friction points this paper is trying to address.

Protecting time series data privacy in distributed smart device environments
Addressing vulnerabilities in local differential privacy for temporal data protection
Preventing noise cancellation during aggregation of time series data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cooperative noise distribution across multiple users
Collaborative noise generation for aggregation cancellation
Scales effectively for large real-time datasets
B. Singh
School of Cybersecurity, Old Dominion University, Norfolk, VA 23529, USA

Md Jakir Hossain
Department of Computer Science and Engineering, Bangladesh University of Engineering and Technology, Dhaka

Rafael Diaz
Professor, Old Dominion University
Digital Supply Chains and Cybersecurity

Sandip Roy
Center for Secure & Intelligent Critical Systems, Old Dominion University, Suffolk, VA 23435, USA

R. Mukkamala
Department of Computer Science, Old Dominion University, Norfolk, VA 23529, USA

Sachin Shetty
Old Dominion University
Blockchain, Cyber Resilience, Trustworthy AI