Privacy-Preserving Federated Convex Optimization: Balancing Partial-Participation and Efficiency via Noise Cancellation

📅 2025-06-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of simultaneously achieving differential privacy (DP) and high convergence efficiency in federated learning (FL) under partial client participation, this paper proposes a novel gradient perturbation mechanism based on noise cancellation. Within the stochastic convex optimization (SCO) framework, the approach is the first to achieve strict DP guarantees alongside the optimal convergence rate of $O(1/\sqrt{T})$ in the partial-participation setting, matching the rate of non-private, full-participation baselines. By jointly designing variance-reduction and noise-compensation strategies, the method eliminates the excess variance induced by participation sparsity, significantly outperforming existing DP-FL methods. The framework accommodates both heterogeneous and homogeneous data distributions, requires no additional communication or computational overhead, and achieves theoretical optimality while remaining practically deployable.
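The summary does not spell out the mechanism itself. As intuition only, one common family of noise-cancellation schemes has participating clients inject correlated noise shares that mask each client's individual message but cancel exactly in the server's aggregate. The sketch below illustrates that cancellation property with a simple pairwise construction; the pairing scheme, seeds, and noise scale are illustrative assumptions, not the paper's actual construction, and a real DP mechanism would also retain calibrated residual noise in the aggregate.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5            # model dimension
num_clients = 4  # clients sampled this round (assumed even, for pairing)
sigma = 1.0      # mask scale (hypothetical; a DP scheme would calibrate this)

# Hypothetical per-client gradients (stand-ins for real local gradients).
grads = [rng.normal(size=d) for _ in range(num_clients)]

# Pairwise noise cancellation: clients 2k and 2k+1 derive the same noise
# vector from a shared seed; one adds +n, the other -n.  Each individual
# message is masked in transit, yet the masks vanish in the server's sum.
masked = []
for k in range(num_clients // 2):
    pair_rng = np.random.default_rng(100 + k)  # shared seed for the pair
    n = sigma * pair_rng.normal(size=d)
    masked.append(grads[2 * k] + n)
    masked.append(grads[2 * k + 1] - n)

aggregate = np.mean(masked, axis=0)
plain = np.mean(grads, axis=0)
# The masks cancel exactly, so the aggregate equals the unmasked mean
# up to floating-point error.
assert np.allclose(aggregate, plain)
```

Because the masks cancel only in the sum, the server learns the aggregate at full accuracy while no single client's unmasked gradient is ever transmitted; this is the sense in which such schemes avoid the variance penalty that per-client independent noise would incur.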

📝 Abstract
This paper tackles the challenge of achieving Differential Privacy (DP) in Federated Learning (FL) under partial-participation, where only a subset of the machines participate in each time-step. While previous work achieved optimal performance in full-participation settings, these methods struggled to extend to partial-participation scenarios. Our approach fills this gap by introducing a novel noise-cancellation mechanism that preserves privacy without sacrificing convergence rates or computational efficiency. We analyze our method within the Stochastic Convex Optimization (SCO) framework and show that it delivers optimal performance for both homogeneous and heterogeneous data distributions. This work expands the applicability of DP in FL, offering an efficient and practical solution for privacy-preserving learning in distributed systems with partial participation.
Problem

Research questions and friction points this paper is trying to address.

Achieving Differential Privacy in Federated Learning with partial-participation
Extending noise-cancellation to maintain privacy and efficiency
Optimizing performance for heterogeneous data in distributed systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Noise-cancellation mechanism for privacy preservation
Optimal performance in partial-participation FL
Efficient convergence for heterogeneous data distributions
Roie Reshef
Faculty of Electrical and Computer Engineering, Technion, Haifa, Israel
Kfir Yehuda Levy
Associate Professor at Technion - Israel Institute of Technology
Machine Learning · Stochastic Optimization