Asymptotically Optimal Secure Aggregation for Wireless Federated Learning with Multiple Servers

📅 2025-06-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the secure-aggregation latency bottleneck in multi-server wireless federated learning. The authors propose a privacy-preserving coded aggregation scheme that jointly leverages multi-secret sharing and artificial noise alignment, achieving the first rigorous information-theoretic privacy guarantee for the multi-server setting. By integrating coded computation with normalized delivery time (NDT) modeling, they derive a fundamental lower bound on the uplink NDT and prove that when the number of servers greatly exceeds the number of users, both the uplink and downlink NDT are optimal; for arbitrary system parameters, the uplink NDT is within a factor of four of the optimum. The proposed scheme substantially reduces communication latency while ensuring strong privacy, computational efficiency, and system scalability.

📝 Abstract
In this paper, we investigate the transmission latency of the secure aggregation problem in a *wireless* federated learning system with multiple curious servers. We propose a privacy-preserving coded aggregation scheme in which the servers cannot infer any information about the distributed users' local gradients, nor about the aggregated value. In our scheme, each user encodes its local gradient into $K$ confidential messages, intended exclusively for the different servers, using a multi-secret sharing method; each server forwards the summation of the confidential messages it receives, while the users sequentially employ artificial noise alignment techniques to facilitate secure transmission. From these summations, each user can recover the aggregation of all local gradients. We prove the privacy guarantee in the information-theoretic sense and characterize the uplink and downlink communication latency measured by the *normalized delivery time* (NDT), both of which decrease monotonically with the number of servers $K$ while increasing over most of the range of the number of users $M$. Finally, we establish a lower bound on the NDT of the considered system and theoretically prove that the scheme achieves the optimal uplink and downlink NDT under the conditions $K \gg M \gg 0$ and $K \gg M$, respectively. For arbitrary $K$ and $M$, the proposed scheme achieves the optimal uplink NDT within a multiplicative gap of $4$.
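The share-and-forward flow described above can be illustrated with plain additive secret sharing over a prime field. This is a minimal sketch, assuming simple additive sharing in place of the paper's multi-secret sharing code and ignoring the wireless channel and artificial noise entirely; all names and the modulus are illustrative.

```python
import random

P = 2**31 - 1  # prime modulus for share arithmetic (illustrative choice)

def share_gradient(grad, num_servers, rng):
    """Split a gradient vector into num_servers additive shares mod P.

    Any proper subset of the shares is jointly uniform, so no single
    server (here, share index) learns anything about grad on its own.
    """
    shares = [[rng.randrange(P) for _ in grad] for _ in range(num_servers - 1)]
    last = [(g - sum(col)) % P for g, col in zip(grad, zip(*shares))]
    return shares + [last]

def aggregate(all_user_shares, num_servers, dim):
    """Each server sums the shares it received from all users; a user
    then sums the per-server results to recover the aggregate gradient."""
    server_sums = [
        [sum(user[k][i] for user in all_user_shares) % P for i in range(dim)]
        for k in range(num_servers)
    ]
    return [sum(s[i] for s in server_sums) % P for i in range(dim)]

rng = random.Random(0)
K, M, dim = 4, 3, 5                      # servers, users, gradient length
grads = [[rng.randrange(100) for _ in range(dim)] for _ in range(M)]
shares = [share_gradient(g, K, rng) for g in grads]
agg = aggregate(shares, K, dim)
expected = [sum(g[i] for g in grads) % P for i in range(dim)]
assert agg == expected
```

Note that each server's summation is itself a uniformly random vector; only by combining all $K$ summations does a user recover the aggregate, mirroring the privacy property claimed in the abstract.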
Problem

Research questions and friction points this paper is trying to address.

Minimize transmission latency in wireless federated learning
Ensure privacy of user gradients in multi-server systems
Achieve optimal communication latency with secure aggregation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-secret sharing for secure gradient encoding
Artificial noise alignment for secure transmission
Latency optimality guarantees via normalized delivery time (NDT) analysis
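Artificial noise alignment operates on physical-layer signals, but a rough discrete analogue of its effect is a set of zero-sum masks: each user's transmission looks random on its own, yet the noise cancels in the final summation. This is a toy sketch under that analogy, not the paper's scheme; all names are illustrative.

```python
import random

P = 2**31 - 1  # prime modulus (illustrative)

def zero_sum_masks(num_users, dim, rng):
    """Generate one mask per user such that all masks sum to zero mod P.

    Each masked transmission is individually uniform, yet the masks
    vanish when every user's transmission is added together.
    """
    masks = [[rng.randrange(P) for _ in range(dim)] for _ in range(num_users - 1)]
    last = [(-sum(col)) % P for col in zip(*masks)]
    return masks + [last]

rng = random.Random(1)
M, dim = 3, 4                            # users, gradient length
grads = [[rng.randrange(50) for _ in range(dim)] for _ in range(M)]
masks = zero_sum_masks(M, dim, rng)
masked = [[(g + n) % P for g, n in zip(grad, mask)]
          for grad, mask in zip(grads, masks)]
agg = [sum(m[i] for m in masked) % P for i in range(dim)]
assert agg == [sum(g[i] for g in grads) % P for i in range(dim)]
```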
Zhenhao Huang
School of Information Science and Technology, ShanghaiTech University, Shanghai, China
Kai Liang
School of Information Science and Technology, ShanghaiTech University, Shanghai, China
Yuanming Shi
Professor, ShanghaiTech University
Space Computing Networks · Edge Artificial Intelligence · Large-Scale Optimization
Songze Li
School of Cyber Science and Engineering, Southeast University, Nanjing, China
Youlong Wu
School of Information Science and Technology, ShanghaiTech University, Shanghai, China