🤖 AI Summary
In federated learning (FL), client updates and the global model can inadvertently leak sensitive data, posing compliance risks under regulations such as HIPAA and GDPR—especially in healthcare. Existing local differential privacy (LDP) approaches suffer from excessive resource overhead and fail to guarantee privacy under asynchronous participation, rendering them impractical for high-stakes domains. To address these limitations, we propose L-RDP, a lightweight, rigorously defined LDP mechanism tailored for FL. L-RDP features a fixed and significantly reduced memory footprint to mitigate client dropout, enables precise cumulative privacy budget accounting under asynchronous participation, and operates without centralized trust assumptions. Experiments demonstrate that, while strictly enforcing ε-differential privacy, L-RDP improves model generalization and cross-client fairness while reducing both communication and computational overhead. L-RDP thus delivers a verifiable, scalable, and regulation-compliant privacy-preserving framework for deploying FL in highly sensitive application domains.
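The cumulative budget accounting mentioned above can be illustrated with basic sequential composition: each client's privacy spend is the sum of the per-round ε values for the rounds it actually joined, so clients that participate intermittently are charged only for their own rounds. The sketch below is a generic accountant under that assumption (the class name and API are illustrative; the paper's L-RDP accounting may be tighter than plain summation):

```python
from collections import defaultdict


class BudgetAccountant:
    """Track each client's cumulative privacy spend under basic
    sequential composition. Illustrative only -- not the L-RDP
    accountant from the paper."""

    def __init__(self, total_budget: float):
        self.total = total_budget          # per-client epsilon cap
        self.spent = defaultdict(float)    # client id -> epsilon spent so far

    def can_participate(self, client: str, eps_round: float) -> bool:
        """True if charging eps_round keeps the client within budget."""
        return self.spent[client] + eps_round <= self.total

    def charge(self, client: str, eps_round: float) -> None:
        """Record one round of participation for this client."""
        if not self.can_participate(client, eps_round):
            raise ValueError(f"client {client!r} would exceed its budget")
        self.spent[client] += eps_round
```

A client that skips a round is simply never charged for it, which is how per-client guarantees survive asynchronous participation under this simple model.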
📝 Abstract
Federated learning (FL) enables organizations to collaboratively train models without sharing their datasets. Despite this advantage, recent studies show that both client updates and the global model can leak private information, limiting adoption in sensitive domains such as healthcare. Local differential privacy (LDP) offers strong protection by letting each participant privatize updates before transmission. However, existing LDP methods were designed for centralized training and introduce challenges in FL, including high resource demands that can cause client dropouts and a lack of reliable privacy guarantees under asynchronous participation. These issues undermine model generalizability, fairness, and compliance with regulations such as HIPAA and GDPR. To address them, we propose L-RDP, an LDP method tailored for FL that ensures constant, lower memory usage to reduce dropouts and provides rigorous per-client privacy guarantees by accounting for intermittent participation.
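The client-side privatization step described above — each participant perturbing its update before transmission — can be sketched with a standard Laplace mechanism: clip the update to bound its L1 sensitivity, then add noise with scale sensitivity/ε. This is a generic LDP baseline under assumed parameters (`clip_norm`, `epsilon`), not the L-RDP mechanism itself:

```python
import numpy as np


def privatize_update(update, clip_norm=1.0, epsilon=0.5, rng=None):
    """Release an epsilon-LDP version of a client's model update.

    Generic clip-and-Laplace sketch, not the paper's L-RDP mechanism.
    Clipping bounds the L1 sensitivity at clip_norm; Laplace noise with
    scale clip_norm / epsilon then gives epsilon-DP for the release.
    """
    rng = rng or np.random.default_rng()
    update = np.asarray(update, dtype=float)

    # Clip so that the L1 norm (and hence the sensitivity) is at most clip_norm.
    l1 = np.abs(update).sum()
    if l1 > clip_norm:
        update = update * (clip_norm / l1)

    # Calibrate Laplace noise to sensitivity / epsilon and perturb locally.
    noise = rng.laplace(loc=0.0, scale=clip_norm / epsilon, size=update.shape)
    return update + noise
```

Because the noise is added on the client before anything leaves the device, no centralized trust in the aggregator is required — the property the abstract highlights for LDP.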