A Multi-Prototype-Guided Federated Knowledge Distillation Approach in AI-RAN Enabled Multi-Access Edge Computing System

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the performance degradation of federated learning in AI-RAN-enabled multi-access edge computing caused by non-independent and identically distributed (non-IID) data. To mitigate the adverse effects of data heterogeneity, the authors propose a Multi-Prototype guided Federated Knowledge Distillation (MP-FedKD) framework. MP-FedKD incorporates a self-knowledge distillation mechanism, leverages Conditional Hierarchical Agglomerative Clustering (CHAC) to generate multiple prototype representations, and introduces a novel Local-Global Embedding Matching loss with Prototype guidance (LEMGP) to align prototypes and preserve semantic information across clients. Extensive experiments across multiple datasets under diverse non-IID settings demonstrate that MP-FedKD consistently outperforms state-of-the-art methods in terms of classification accuracy, average accuracy, and regression error metrics (RMSE and MAE).
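The summary names a Local-Global Embedding Matching loss with Prototype guidance (LEMGP) that aligns local embeddings with global prototypes. As a hedged sketch only (not the paper's exact formulation), the core alignment idea can be illustrated as a squared-distance term pulling each local embedding toward the global prototype of its class; the paper's actual loss also involves self-knowledge distillation, which is not modeled here, and all names below are placeholders.

```python
# Illustrative prototype-guidance term in the spirit of the LEMGP loss:
# pull each local embedding toward the global prototype of its class.
import numpy as np

def prototype_matching_loss(local_emb, labels, global_protos):
    """Mean squared distance between each local embedding and the
    global prototype of its class."""
    diffs = local_emb - global_protos[labels]          # shape (N, d)
    return float((diffs ** 2).sum(axis=1).mean())

global_protos = np.array([[0.0, 0.0], [4.0, 4.0]])    # one prototype per class
aligned = np.array([[0.1, -0.1], [3.9, 4.1]])         # embeddings near prototypes
shifted = np.array([[2.0, 2.0], [2.0, 2.0]])          # embeddings far from prototypes
labels = np.array([0, 1])

low = prototype_matching_loss(aligned, labels, global_protos)    # small loss
high = prototype_matching_loss(shifted, labels, global_protos)   # large loss
```

Minimizing such a term during local training keeps client-side representations close to the shared global prototypes, which is the intuition behind prototype-based handling of non-IID data.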

📝 Abstract
With the development of wireless networks, Multi-Access Edge Computing (MEC) and the Artificial Intelligence (AI)-native Radio Access Network (RAN) have attracted significant attention. In particular, the integration of AI-RAN and MEC is envisioned to transform network efficiency and responsiveness, making AI-RAN-enabled MEC systems a valuable subject of study. Federated learning (FL) is emerging as a promising approach for such systems, enabling edge devices to train a global model cooperatively without revealing their raw data. However, conventional FL struggles with non-independent and identically distributed (non-IID) data. A single prototype, obtained by averaging the embedding vectors of each class, can be employed in FL to handle this data heterogeneity, but the averaging operation may discard useful information. Therefore, this paper proposes a multi-prototype-guided federated knowledge distillation (MP-FedKD) approach. Specifically, self-knowledge distillation is integrated into FL to deal with the non-IID issue. To cope with the information loss caused by single-prototype strategies, a multi-prototype strategy is adopted, comprising a conditional hierarchical agglomerative clustering (CHAC) approach and a prototype alignment scheme. Additionally, we design a novel loss function (called the LEMGP loss) for each local client that focuses on the relationship between global prototypes and local embeddings. Extensive experiments over multiple datasets with various non-IID settings show that the proposed MP-FedKD approach outperforms state-of-the-art baselines in accuracy, average accuracy, and error metrics (RMSE and MAE).
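The abstract's motivation for multiple prototypes (averaging all of a class's embeddings into one vector loses information) can be sketched numerically. The following is a minimal illustration, not the paper's CHAC: plain average-linkage agglomerative clustering (via SciPy) stands in for the conditional variant, to show how several prototypes per class retain structure that a single class-mean prototype averages away.

```python
# Hedged sketch: build multiple prototypes per class by clustering the
# class's embedding vectors, instead of taking one overall mean.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def multi_prototypes(class_embeddings, n_prototypes):
    """Cluster one class's embedding vectors and return one prototype
    (cluster mean) per cluster."""
    Z = linkage(class_embeddings, method="average")
    labels = fcluster(Z, t=n_prototypes, criterion="maxclust")
    return np.stack([class_embeddings[labels == k].mean(axis=0)
                     for k in np.unique(labels)])

# A class whose embeddings form two well-separated modes: the single
# mean prototype lands between them, while two prototypes track each mode.
rng = np.random.default_rng(0)
class_emb = np.vstack([rng.normal(-3.0, 0.1, size=(20, 4)),
                       rng.normal(+3.0, 0.1, size=(20, 4))])
single_proto = class_emb.mean(axis=0)          # near the origin, between modes
multi_proto = multi_prototypes(class_emb, 2)   # one prototype near each mode
```

Here the lone averaged prototype sits in a region containing no actual embeddings, whereas the clustered prototypes each represent a real mode of the class, which is the information-loss argument the abstract makes.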
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
non-IID data
Multi-Access Edge Computing
AI-RAN
data heterogeneity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Knowledge Distillation
Multi-Prototype Learning
Non-IID Data
Edge Intelligence
Prototype Alignment
Luyao Zou
College of Computing and Informatics, Sungkyunkwan University, Republic of Korea
Hayoung Oh
Sungkyunkwan University (artificial intelligence)
Chu Myaet Thwal
Department of Computer Science and Engineering, Kyung Hee University, Yongin-si, Gyeonggi-do, 17104, Republic of Korea
Apurba Adhikary
Department of Information and Communication Engineering, Noakhali Science and Technology University, Noakhali-3814, Bangladesh
Seohyeon Hong
College of Computing and Informatics, Sungkyunkwan University, Republic of Korea
Zhu Han
University of Houston (Game Theory, Wireless Networking, Security, Data Science, Smart Grid)