Hi-SAFE: Hierarchical Secure Aggregation for Lightweight Federated Learning

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Federated learning in IoT and edge networks faces the dual challenges of privacy leakage (e.g., sign-based gradient inference attacks) and excessive communication overhead. Method: This paper proposes the first lightweight, cryptographically secure aggregation framework compatible with sign-based gradient transmission (e.g., SIGNSGD). It constructs a low-degree majority-vote polynomial over a finite field via Fermat's Little Theorem and combines it with hierarchical subgroup partitioning, which keeps the multiplicative depth constant, to securely aggregate gradient signs. Crucially, it requires no modification to clients' local update procedures. Contribution/Results: The framework enables provably secure multi-party computation with per-user communication and computational costs that are independent of the number of clients. Experiments demonstrate sublinear complexity even in million-device settings, significantly outperforming existing secure aggregation schemes, and establish a new paradigm for jointly optimizing privacy and efficiency in resource-constrained environments.

📝 Abstract
Federated learning (FL) faces challenges in ensuring both privacy and communication efficiency, particularly in resource-constrained environments such as Internet of Things (IoT) and edge networks. While sign-based methods, such as sign stochastic gradient descent with majority voting (SIGNSGD-MV), offer substantial bandwidth savings, they remain vulnerable to inference attacks due to exposure of gradient signs. Existing secure aggregation techniques are either incompatible with sign-based methods or incur prohibitive overhead. To address these limitations, we propose Hi-SAFE, a lightweight and cryptographically secure aggregation framework for sign-based FL. Our core contribution is the construction of efficient majority vote polynomials for SIGNSGD-MV, derived from Fermat's Little Theorem. This formulation represents the majority vote as a low-degree polynomial over a finite field, enabling secure evaluation that hides intermediate values and reveals only the final result. We further introduce a hierarchical subgrouping strategy that ensures constant multiplicative depth and bounded per-user complexity, independent of the number of users n.
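The Fermat's Little Theorem idea behind the majority vote polynomial can be illustrated with a toy sketch. This is a simplified stand-in, not the paper's exact construction: since (σ − a)^(p−1) ≡ 0 (mod p) exactly when σ = a, and ≡ 1 otherwise, an indicator for "the sign sum σ lies in {1, …, n}" (i.e., the positive signs won) is a single polynomial over F_p. The function name `majority_vote_poly` and the parameter choices are hypothetical.

```python
# Toy sketch (not the paper's exact polynomial): majority vote over F_p built
# from Fermat's Little Theorem, x^(p-1) ≡ 1 (mod p) for x ≢ 0.
# Signs are encoded as +1 / -1 in F_p. With an odd number of clients n and a
# prime p > 2n, the sign sum sigma lands in {1..n} (positive majority) or in
# {p-n..p-1} (negative majority); it can never be 0.

def majority_vote_poly(signs, p):
    n = len(signs)
    assert n % 2 == 1 and p > 2 * n
    sigma = sum(signs) % p  # aggregate in the field; individual signs stay hidden
    # (sigma - a)^(p-1) is 0 mod p iff sigma == a, so this sum is the indicator
    # of sigma ∈ {1..n}, evaluated as one low-degree polynomial in sigma.
    pos = sum(1 - pow(sigma - a, p - 1, p) for a in range(1, n + 1)) % p
    # Map the {0, 1} indicator back to a sign in {-1, +1}.
    return 1 if pos == 1 else -1

# Example: 5 clients vote on one gradient coordinate in F_13 (p = 13 > 2*5).
votes = [1, -1, 1, 1, -1]
enc = [v % 13 for v in votes]       # encode -1 as p - 1 = 12
print(majority_vote_poly(enc, 13))  # → 1 (majority sign is +1)
```

Because only σ and the final polynomial value are ever manipulated, a secure evaluation of this polynomial reveals the majority sign without exposing any client's individual sign, which is the property the abstract describes.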
Problem

Research questions and friction points this paper is trying to address.

Ensuring privacy and communication efficiency in federated learning
Protecting sign-based methods from inference attacks on gradient signs
Reducing overhead of secure aggregation for resource-constrained environments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Secure majority vote polynomials from Fermat's Little Theorem
Hierarchical subgrouping for constant multiplicative depth
Lightweight cryptographic aggregation for sign-based federated learning
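The hierarchical subgrouping bullet can be sketched as a recursive majority-of-majorities (a hypothetical simplification, not the paper's actual protocol; note that a naive majority-of-majorities can differ from the flat global majority, a subtlety the paper's partitioning scheme must address). Each subgroup of odd size g evaluates the low-degree FLT majority polynomial locally, and the subgroup outputs are aggregated the same way, so the polynomial degree and multiplicative depth per level depend on g and the subgroup prime p, not on the total client count n:

```python
# Toy sketch of hierarchical subgrouping (simplified, not the paper's protocol).

def field_majority(signs, p):
    """FLT-based majority of +/-1 signs in F_p (assumes odd count n, p > 2n)."""
    n = len(signs)
    sigma = sum(signs) % p  # sign sum in the field
    # (sigma - a)^(p-1) is 0 mod p iff sigma == a (Fermat), so this indicator
    # tests whether sigma lies in {1..n}, i.e., whether +1 votes won.
    pos = sum(1 - pow(sigma - a, p - 1, p) for a in range(1, n + 1)) % p
    return 1 if pos == 1 else -1

def hierarchical_majority(signs, g, p):
    """Recursively aggregate majorities over subgroups of odd size g."""
    if len(signs) <= g:
        return field_majority(signs, p)
    assert len(signs) % g == 0  # toy assumption: clients split evenly
    groups = [signs[i:i + g] for i in range(0, len(signs), g)]
    return hierarchical_majority([field_majority(grp, p) for grp in groups], g, p)

# 9 clients in subgroups of 3: two rounds of degree-(p-1) evaluation in F_11,
# so per-user work and circuit depth do not grow with the total client count.
votes = [1, 1, -1, 1, -1, -1, 1, 1, 1]
print(hierarchical_majority(votes, g=3, p=11))  # → 1
```

The point of the sketch is the complexity structure: each level only ever evaluates the small-subgroup polynomial, which is how the constant multiplicative depth and bounded per-user cost claimed in the abstract can hold independent of n.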