A Unified Solution to Diverse Heterogeneities in One-shot Federated Learning

📅 2024-10-28
🏛️ arXiv.org
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This paper addresses the challenge of jointly modeling model and data heterogeneity in one-shot federated learning (OSFL). We propose FedHydra, a data-free, single-round communication framework for heterogeneity-robust federated learning. Its core innovation is a two-stage heterogeneous co-governance mechanism: (i) hierarchical model design coupled with heterogeneity-aware layered aggregation, enabling simultaneous modeling and mitigation of both heterogeneity types under strict single-round constraints; and (ii) data-free knowledge distillation integrated with single-round parameter fusion optimization, achieving unified heterogeneous adaptation. Evaluated on four benchmark datasets, FedHydra consistently outperforms five state-of-the-art methods, improving average accuracy by 3.2–7.8% under heterogeneous settings while maintaining superior performance in homogeneous scenarios.

๐Ÿ“ Abstract
One-Shot Federated Learning (OSFL) restricts communication between the server and clients to a single round, significantly reducing communication costs and minimizing privacy-leakage risks compared to traditional Federated Learning (FL), which requires multiple rounds of communication. However, existing OSFL frameworks remain vulnerable to distributional heterogeneity, as they primarily focus on model heterogeneity while neglecting data heterogeneity. To bridge this gap, we propose FedHydra, a unified, data-free OSFL framework designed to effectively address both model and data heterogeneity. Unlike existing OSFL approaches, FedHydra introduces a novel two-stage learning mechanism: it incorporates model stratification and heterogeneity-aware stratified aggregation to mitigate the challenges posed by both model and data heterogeneity. With this design, data and model heterogeneity are monitored simultaneously from complementary perspectives during learning, so FedHydra can mitigate both issues while minimizing their inherent conflicts. We compare FedHydra against five SOTA baselines on four benchmark datasets. Experimental results show that our method outperforms previous OSFL methods in both homogeneous and heterogeneous settings. Our code is available at https://anonymous.4open.science/r/Fed-SA-A4D7.
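The heterogeneity-aware stratified aggregation described above can be sketched roughly as follows. This is a minimal illustration, not FedHydra's actual algorithm: the function names, the per-stratum averaging, and the KL-divergence-based client weighting are all assumptions introduced here to show the general idea of weighting clients' shared layer groups by how skewed their local data distributions are.

```python
import numpy as np

def stratified_aggregate(client_strata, client_label_counts):
    """Aggregate each shared stratum (layer group) separately, weighting
    clients by how close their label distribution is to uniform.

    client_strata: list over clients; each entry is a list of per-stratum
                   parameter arrays (same shapes across clients).
    client_label_counts: list over clients of per-class sample counts.
    """
    # Heterogeneity-aware weights: a client whose label distribution has
    # lower KL divergence from uniform receives a larger weight.
    weights = []
    for counts in client_label_counts:
        p = np.asarray(counts, dtype=float)
        p = p / p.sum()
        uniform = np.full_like(p, 1.0 / len(p))
        kl = np.sum(np.where(p > 0, p * np.log(p / uniform), 0.0))
        weights.append(1.0 / (1.0 + kl))
    weights = np.asarray(weights)
    weights = weights / weights.sum()

    # Aggregate stratum by stratum, so strata shared across heterogeneous
    # model architectures can be fused independently.
    num_strata = len(client_strata[0])
    aggregated = []
    for s in range(num_strata):
        stacked = np.stack(
            [w * np.asarray(c[s]) for w, c in zip(weights, client_strata)]
        )
        aggregated.append(stacked.sum(axis=0))
    return aggregated
```

Under this toy weighting, a client with a near-uniform label distribution pulls the aggregated parameters toward its own more strongly than a client with heavily skewed data, which is one simple way to make the single aggregation round sensitive to data heterogeneity.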
Problem

Research questions and friction points this paper is trying to address.

Addresses model and data heterogeneity in one-shot federated learning
Proposes a unified framework to mitigate distributional heterogeneity conflicts
Enhances performance in both homogeneous and heterogeneous settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified OSFL framework addressing model and data heterogeneity
Two-stage learning with model stratification and aggregation
Data-free approach minimizing conflicts in heterogeneous settings