FedHENet: A Frugal Federated Learning Framework for Heterogeneous Environments

📅 2026-02-13
📈 Citations: 0
Influential: 0

📝 Abstract
Federated Learning (FL) enables collaborative training without centralizing data, which is essential for privacy compliance in real-world scenarios involving sensitive visual information. Most FL approaches rely on expensive, iterative deep-network optimization, which still risks privacy leakage through shared gradients. In this work, we propose FedHENet, which extends the FedHEONN framework to image classification. By using a fixed, pre-trained feature extractor and learning only a single output layer, we avoid costly local fine-tuning. This layer is learned by analytically aggregating client knowledge in a single round of communication using homomorphic encryption (HE). Experiments show that FedHENet achieves competitive accuracy compared to iterative FL baselines while demonstrating superior stability and up to 70% better energy efficiency. Crucially, our method is hyperparameter-free, removing the carbon footprint associated with hyperparameter tuning in standard FL. Code is available at https://github.com/AlejandroDopico2/FedHENet/
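The core idea of learning a single output layer analytically in one communication round can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's implementation: it models the output layer as a ridge-regression fit on frozen extractor features, where each client shares only additively combinable sufficient statistics (the quantities FedHENet would encrypt homomorphically; encryption is omitted here).

```python
import numpy as np

def client_stats(X, Y):
    # Each client computes sufficient statistics for a closed-form
    # output-layer fit: X (n x d) are frozen-extractor features,
    # Y (n x c) are one-hot targets. In FedHENet-style schemes these
    # statistics would be homomorphically encrypted before sharing.
    return X.T @ X, X.T @ Y

def aggregate(stats, lam=1e-3):
    # The server sums client statistics (a purely additive operation,
    # compatible with additively homomorphic encryption) and solves
    # for the single output layer in one round, with no iterations.
    G = sum(g for g, _ in stats)   # sum of X^T X over clients
    H = sum(h for _, h in stats)   # sum of X^T Y over clients
    d = G.shape[0]
    return np.linalg.solve(G + lam * np.eye(d), H)  # (d x c) weights

# Toy usage: two clients whose data follows a shared linear mapping.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(5, 3))
stats = []
for _ in range(2):
    X = rng.normal(size=(40, 5))
    stats.append(client_stats(X, X @ W_true))
W = aggregate(stats)  # recovers W_true up to ridge shrinkage
```

Because aggregation only sums per-client matrices, the server never sees raw features or labels, and no gradient exchange or local fine-tuning is needed.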
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Heterogeneous Environments
Privacy Preservation
Energy Efficiency
Hyperparameter Tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Learning
Homomorphic Encryption
Energy Efficiency
Hyperparameter-Free
Single-Round Aggregation