Federated Domain Generalization with Data-free On-server Gradient Matching

📅 2025-01-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address weak cross-domain generalization in Federated Domain Generalization (FDG) caused by heterogeneous domain data distributed across clients—and the reliance of conventional methods on client-side data uploads or additional communication overhead—this paper proposes a data-free, server-side gradient-matching framework that adds no communication cost. The method implicitly aligns gradient directions across domains on the server without requiring raw data or feature transmission from clients. Its core contribution is a gradient inner-product maximization mechanism for learning a domain-invariant gradient direction, enabling the server to aggregate domain-specific characteristics without extra communication. The approach is orthogonal to mainstream FL and FDG algorithms, offering plug-and-play compatibility. Extensive experiments on four federated learning benchmarks (MNIST, EMNIST, CIFAR-10, CIFAR-100) and three FDG benchmarks (PACS, VLCS, OfficeHome) demonstrate consistent and significant improvements over state-of-the-art methods.

📝 Abstract
Domain Generalization (DG) aims to learn from multiple known source domains a model that can generalize well to unknown target domains. One of the key approaches in DG is training an encoder which generates domain-invariant representations. However, this approach is not applicable in Federated Domain Generalization (FDG), where data from various domains are distributed across different clients. In this paper, we introduce a novel approach, dubbed Federated Learning via On-server Matching Gradient (FedOMG), which can *efficiently leverage domain information from distributed domains*. Specifically, we utilize the local gradients as information about the distributed models to find an invariant gradient direction across all domains through gradient inner product maximization. The advantages are two-fold: 1) FedOMG can aggregate the characteristics of distributed models on the centralized server without incurring any additional communication cost, and 2) FedOMG is orthogonal to many existing FL/FDG methods, allowing for additional performance improvements by being seamlessly integrated with them. Extensive experimental evaluations across various settings demonstrate the robustness of FedOMG compared to other FL/FDG baselines. Our method outperforms recent SOTA baselines on four FL benchmark datasets (MNIST, EMNIST, CIFAR-10, and CIFAR-100) and three FDG benchmark datasets (PACS, VLCS, and OfficeHome).
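The core idea above—finding an invariant gradient direction on the server by maximizing inner products with each client's local gradient—can be sketched in simplified form. The paper does not publish this exact procedure here; the snippet below is a minimal illustrative stand-in that searches for a convex combination of client gradients whose aggregate direction maximizes the minimum inner product with every client gradient (the function name, objective simplification, and hyperparameters are assumptions, not the authors' implementation):

```python
import numpy as np

def match_gradients(grads, lr=0.1, steps=200):
    """Illustrative on-server gradient matching: find convex-combination
    weights over client gradients whose aggregate direction maximizes the
    minimum inner product with each client's gradient, so the server update
    does not conflict with any domain. Simplified stand-in, not FedOMG itself."""
    G = np.stack(grads)              # (K, D): one flattened gradient per client
    K = G.shape[0]
    w = np.full(K, 1.0 / K)         # start from uniform averaging (FedAvg-like)
    S = G @ G.T                      # pairwise inner products between gradients
    for _ in range(steps):
        align = S @ w                # inner product of aggregate with each g_i
        i = np.argmin(align)         # worst-aligned client domain
        step = S[i]                  # subgradient ascent on the min inner product
        w += lr * step / (np.linalg.norm(step) + 1e-12)
        w = np.clip(w, 0.0, None)
        w /= w.sum()                 # project back onto the probability simplex
    return G.T @ w                   # matched server update direction
```

Note that the server only needs the local gradients it already receives in standard FL aggregation, which is why this style of matching incurs no additional communication.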
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Domain Generalization
Data Distribution
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Domain Generalization
Data Integration
Performance Enhancement