Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training

📅 2024-12-16
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address poor domain generalization of models in federated learning caused by client data heterogeneity, this paper proposes a novel approach integrating label smoothing with decentralized training equilibrium. We design a client-level label smoothing mechanism to mitigate domain-specific overfitting and introduce a decentralized optimization framework based on dynamic budget allocation to adaptively balance communication and computational loads across clients. The method operates without a central server, thereby enhancing privacy preservation and system robustness. Evaluated on four major cross-domain benchmarks—PACS, VLCS, OfficeHome, and TerraInc—the proposed method achieves state-of-the-art performance across three core metrics. Notably, it significantly improves the average generalization accuracy of the global model, demonstrating both effectiveness and broad applicability under non-i.i.d. data distributions.

Technology Category

Application Category

📝 Abstract
In this paper, we propose a novel approach, Federated Domain Generalization with Label Smoothing and Balanced Decentralized Training (FedSB), to address the challenges of data heterogeneity within a federated learning framework. FedSB utilizes label smoothing at the client level to prevent overfitting to domain-specific features, thereby enhancing generalization capabilities across diverse domains when aggregating local models into a global model. Additionally, FedSB incorporates a decentralized budgeting mechanism which balances training among clients, which is shown to improve the performance of the aggregated global model. Extensive experiments on four commonly used multi-domain datasets, PACS, VLCS, OfficeHome, and TerraInc, demonstrate that FedSB outperforms competing methods, achieving state-of-the-art results on three out of four datasets, indicating the effectiveness of FedSB in addressing data heterogeneity.
Problem

Research questions and friction points this paper is trying to address.

Address data heterogeneity in federated learning.
Enhance generalization across diverse domains.
Balance training among decentralized clients.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated learning framework
Label smoothing technique
Decentralized budgeting mechanism
🔎 Similar Papers
No similar papers found.