FedRD: Reducing Divergences for Generalized Federated Learning via Heterogeneity-aware Parameter Guidance

πŸ“… 2026-01-28
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the challenge of generalization to unseen clients in heterogeneous federated learning by explicitly identifying and jointly tackling two key issues: optimization divergence and performance divergence. To this end, the authors propose FedRD, a novel algorithm that employs a heterogeneity-aware parameter guidance mechanism to jointly optimize global model aggregation and local debiased classifier training. This approach effectively narrows the performance gap between participating and unseen clients. Extensive experiments on multiple public multi-domain datasets demonstrate that FedRD significantly outperforms existing methods, substantially enhancing the model’s generalization capability to newly joined, heterogeneous clients.

πŸ“ Abstract
Heterogeneous federated learning (HFL) aims to ensure effective and privacy-preserving collaboration among different entities. Because newly joined clients require significant adjustments and additional training to align with the existing system, generalizing federated learning models to unseen clients under heterogeneous data has become increasingly important. We highlight two unsolved challenges in federated domain generalization: Optimization Divergence and Performance Divergence. To tackle them, we propose FedRD, a novel heterogeneity-aware federated learning algorithm that combines parameter-guided global generalization aggregation with local debiased classification to reduce both divergences, aiming to obtain a global model that performs well for participating and unseen clients alike. Extensive experiments on public multi-domain datasets demonstrate that our approach exhibits a substantial performance advantage over competing baselines on this problem.
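The abstract does not specify FedRD's aggregation rule, so as a rough intuition for what "parameter-guided aggregation" can mean in general, here is a minimal sketch of heterogeneity-aware weighted averaging: clients whose parameter updates diverge most from the consensus are down-weighted. The softmax-over-distances weighting and the `temperature` parameter are assumptions for illustration, not FedRD's actual mechanism.

```python
import numpy as np

def heterogeneity_aware_aggregate(client_params, temperature=1.0):
    """Aggregate client parameter vectors, down-weighting clients whose
    updates lie far from the mean.

    NOTE: illustrative sketch only. The weighting rule (softmax over
    negative distances to the FedAvg mean) is a hypothetical stand-in,
    not the mechanism proposed in the FedRD paper.
    """
    stacked = np.stack(client_params)            # shape: (n_clients, dim)
    consensus = stacked.mean(axis=0)             # plain FedAvg average
    dists = np.linalg.norm(stacked - consensus, axis=1)
    # Farther from consensus -> smaller weight (numerically stable softmax).
    logits = -dists / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    return weights @ stacked                     # weighted average

# Usage: three clients, the third an outlier domain.
clients = [np.array([1.0, 1.0]), np.array([1.1, 0.9]), np.array([5.0, 5.0])]
agg = heterogeneity_aware_aggregate(clients)
# The outlier is down-weighted, pulling the result toward the first two clients.
```

This contrasts with plain FedAvg, which would weight all three clients equally regardless of how heterogeneous their updates are.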
Problem

Research questions and friction points this paper is trying to address.

Heterogeneous Federated Learning
Federated Domain Generalization
Optimization Divergence
Performance Divergence
Unseen Clients
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Domain Generalization
Heterogeneous Federated Learning
Parameter-guided Aggregation
Optimization Divergence
Local Debiasing