Towards Performance-Enhanced Model-Contrastive Federated Learning using Historical Information in Heterogeneous Scenarios

πŸ“… 2026-02-12
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the challenges of poor convergence and performance instability in federated learning under data heterogeneity and imbalanced client participation. To mitigate these issues, the authors propose the PMFL framework, which introduces a novel integration of historical local models into contrastive learning to enhance update consistency. The framework adaptively adjusts aggregation weights based on each client’s cumulative participation count and further incorporates historical global models to improve training stability. By synergistically combining model contrast, adaptive weighting, and historical model fusion, PMFL achieves significantly superior performance across diverse heterogeneous settings, effectively accelerating convergence and enhancing model generalization compared to existing approaches.
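The node-side contrastive term described above can be sketched as follows. This is a hedged illustration assuming a MOON-style contrastive loss in which a historical local model's representation serves as the stable contrastive (negative) point; the function name, temperature `tau`, and representation vectors are illustrative assumptions, not the paper's actual notation.

```python
import numpy as np

def contrastive_loss(z_local, z_global, z_hist, tau=0.5):
    """Illustrative model-contrastive term: pull the current local
    representation z_local toward the global model's representation
    z_global (positive pair) and push it away from a historical local
    model's representation z_hist (the stable contrastive point)."""
    def cos_sim(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    pos = np.exp(cos_sim(z_local, z_global) / tau)
    neg = np.exp(cos_sim(z_local, z_hist) / tau)
    # InfoNCE-style loss: small when z_local aligns with z_global.
    return -np.log(pos / (pos + neg))
```

When the local representation aligns with the global one, the loss is small; when it drifts toward the historical local model instead, the loss grows, which is the intended pull toward update consistency.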

πŸ“ Abstract
Federated Learning (FL) enables multiple nodes to collaboratively train a model without sharing raw data. However, FL systems are usually deployed in heterogeneous scenarios, where nodes differ in both data distributions and participation frequencies, which undermines FL performance. To tackle this issue, this paper proposes PMFL, a performance-enhanced model-contrastive federated learning framework that uses historical training information. Specifically, on the node side, we introduce a novel model-contrastive term into the node optimization objective, incorporating historical local models as stable contrastive points to improve the consistency of model updates under heterogeneous data distributions. On the server side, we use each node's cumulative participation count to adaptively adjust its aggregation weight, correcting the bias in the global objective caused by differing node participation frequencies. Furthermore, the updated global model incorporates historical global models to reduce performance fluctuations between adjacent rounds. Extensive experiments demonstrate that PMFL outperforms existing FL methods in heterogeneous scenarios.
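The server-side steps in the abstract can be sketched as below. This is a minimal sketch under stated assumptions: the inverse-participation-count weighting and the convex fusion coefficient `beta` are illustrative choices standing in for the paper's adaptive weighting and historical-fusion rules, whose exact forms the abstract does not give.

```python
import numpy as np

def server_update(local_models, participation_counts, prev_global, beta=0.5):
    """Illustrative server step: (1) aggregate local models with weights
    that down-weight frequently participating nodes, then (2) fuse the
    result with the previous global model to damp round-to-round
    performance fluctuations."""
    counts = np.asarray(participation_counts, dtype=float)
    # Assumed bias correction: nodes with fewer cumulative participations
    # receive larger aggregation weights.
    weights = 1.0 / counts
    weights /= weights.sum()
    aggregated = sum(w * m for w, m in zip(weights, local_models))
    # Assumed historical fusion: convex combination with the last global model.
    return beta * aggregated + (1.0 - beta) * prev_global
```

With `beta=1.0` the fusion step is disabled and the function reduces to plain weighted aggregation; smaller `beta` trades responsiveness for stability across rounds.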
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Heterogeneous Scenarios
Data Distribution
Participation Frequency
Model Performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

model-contrastive learning
federated learning
heterogeneous scenarios
historical information
adaptive aggregation
Hongliang Zhang
Professor of University of Shanghai for Science and Technology
Air pollution, source apportionment, aerosol, atmospheric science
Jiguo Yu
School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, 610054, China
Guijuan Wang
Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center, Qilu University of Technology (Shandong Academy of Sciences), Jinan, 250353, China
Wenshuo Ma
Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center, Qilu University of Technology (Shandong Academy of Sciences), Jinan, 250353, China
Tianqing He
School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, 610054, China
Baobao Chai
School of Computer Science and Engineering, Shandong University of Science and Technology, Jinan, 266590, China
Chunqiang Hu
Professor of Big Data & Software Engineering, Chongqing University.
Data-Driven Security and Privacy, Algorithm Design and Analysis