Energy Efficient Federated Learning with Hyperdimensional Computing (HDC)

📅 2026-02-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work proposes a lightweight federated learning framework for wireless edge networks that integrates hyperdimensional computing (HDC) with differential privacy (DP) to address high energy consumption, substantial computational overhead, and privacy leakage risks. HDC is introduced into federated learning for the first time, and a joint optimization of HDC dimensionality, transmission power, and CPU frequency is formulated. To enable efficient resource allocation, a hybrid search strategy—combining outer-layer enumeration with inner-layer one-dimensional search—is devised. The proposed approach significantly reduces energy consumption while preserving privacy, achieving up to an 83.3% reduction in total energy usage compared to baseline methods, without compromising model accuracy or convergence speed.

Technology Category

Application Category

📝 Abstract
This paper investigates the problem of minimizing total energy consumption for secure federated learning (FL) in wireless edge networks, a key paradigm for decentralized big data analytics. To tackle the high computational cost and privacy challenges of processing large-scale distributed data with conventional neural networks, we propose an FL with hyperdimensional computing and differential privacy (FL-HDC-DP) framework. Each edge device employs hyperdimensional computing (HDC) for lightweight local training and applies differential privacy (DP) noise to protect transmitted model updates. The total energy consumption is minimized through a joint optimization of the HDC dimension, transmit power, and CPU frequency. An efficient hybrid algorithm is developed, combining an outer enumeration search for HDC dimensions with an inner one-dimensional search for resource allocation. Simulation results show that the proposed framework achieves up to 83.3% energy reduction compared with baseline schemes, while maintaining high accuracy and faster convergence.
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Energy Efficiency
Hyperdimensional Computing
Differential Privacy
Wireless Edge Networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hyperdimensional Computing
Federated Learning
Differential Privacy
Energy Efficiency
Edge Computing
🔎 Similar Papers
No similar papers found.