RefProtoFL: Communication-Efficient Federated Learning via External-Referenced Prototype Alignment

📅 2026-01-21
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the challenges of limited communication bandwidth and degraded model generalization in federated learning under edge scenarios, both exacerbated by data heterogeneity. To mitigate these issues, the authors propose RefProtoFL, a framework that leverages a small set of public data on the server to construct external reference prototypes, aligning client-side feature representations and thereby alleviating heterogeneity. Communication overhead is reduced by transmitting only lightweight adapter parameters, further compressed by an Adaptive Probabilistic Update Dropping (APUD) mechanism that performs Top-K sparsification. Global prototypes are aggregated via a weighted averaging strategy. Experimental results show that the proposed method achieves higher classification accuracy than existing prototype-based federated learning approaches on standard benchmarks.
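The Top-K sparsification that APUD builds on can be sketched as follows. This is a minimal, generic illustration of magnitude-aware Top-K update compression, not the paper's exact APUD rule (which adds an adaptive probabilistic dropping criterion); the function name and shapes are assumptions.

```python
import numpy as np

def topk_sparsify(update, k):
    """Keep only the k largest-magnitude entries of a (flattened) adapter
    update and zero the rest, so only k values need to be transmitted.
    A generic Top-K sketch; RefProtoFL's APUD layers an adaptive
    probabilistic dropping rule on top of this idea."""
    flat = update.ravel()
    if k >= flat.size:
        return update.copy()
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(update.shape)

update = np.array([0.1, -2.0, 0.05, 1.5, -0.3])
print(topk_sparsify(update, 2))  # only -2.0 and 1.5 survive
```

In practice a client would send just the surviving index/value pairs, reducing uplink cost from the full adapter size to O(k).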

๐Ÿ“ Abstract
Federated learning (FL) enables collaborative model training without sharing raw data in edge environments, but is constrained by limited communication bandwidth and heterogeneous client data distributions. Prototype-based FL mitigates these constraints by exchanging class-wise feature prototypes instead of full model parameters; however, existing methods still suffer from suboptimal generalization under severe communication constraints. In this paper, we propose RefProtoFL, a communication-efficient FL framework that integrates External-Referenced Prototype Alignment (ERPA) for representation consistency with Adaptive Probabilistic Update Dropping (APUD) for communication efficiency. Specifically, we decompose the model into a private backbone and a lightweight shared adapter, and restrict federated communication to the adapter parameters only. To further reduce uplink cost, APUD performs magnitude-aware Top-K sparsification, transmitting only the most significant adapter updates for server-side aggregation. To address representation inconsistency across heterogeneous clients, ERPA leverages a small server-held public dataset to construct external reference prototypes that serve as shared semantic anchors. For classes covered by public data, clients directly align local representations to public-induced prototypes, whereas for uncovered classes, alignment relies on server-aggregated global reference prototypes via weighted averaging. Extensive experiments on standard benchmarks demonstrate that RefProtoFL attains higher classification accuracy than state-of-the-art prototype-based FL baselines.
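The weighted-averaging aggregation of client prototypes described above can be sketched as below. This is a plain sample-count-weighted average, a common choice in prototype-based FL; the paper's exact weighting scheme and data structures are assumptions here.

```python
import numpy as np

def aggregate_prototypes(client_protos, client_counts):
    """Aggregate per-client class prototypes into global reference
    prototypes via a weighted average, weighting each client's prototype
    by its local sample count for that class (a sketch; RefProtoFL's
    exact weights may differ).

    client_protos: {client_id: {class_id: feature vector (np.ndarray)}}
    client_counts: {client_id: {class_id: number of local samples}}
    """
    sums = {}    # class_id -> running weighted sum of prototypes
    totals = {}  # class_id -> total sample count across clients
    for cid, protos in client_protos.items():
        for c, p in protos.items():
            n = client_counts[cid][c]
            sums[c] = sums.get(c, np.zeros_like(p)) + n * p
            totals[c] = totals.get(c, 0) + n
    return {c: sums[c] / totals[c] for c in sums}
```

For classes absent from the server's public data, clients would align their local representations to these aggregated global prototypes instead of the public-induced ones.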
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Communication Efficiency
Data Heterogeneity
Prototype Alignment
Edge Computing
Innovation

Methods, ideas, or system contributions that make the work stand out.

External-Referenced Prototype Alignment
Adaptive Probabilistic Update Dropping
Prototype-based Federated Learning
Communication Efficiency
Representation Consistency