Decentralized Federated Dataset Dictionary Learning for Multi-Source Domain Adaptation

📅 2025-03-22
🏛️ IEEE International Conference on Acoustics, Speech, and Signal Processing
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses decentralized multi-source domain adaptation in heterogeneous environments without a central server: transferring knowledge from multiple labeled source domains to an unlabeled target domain while preserving data privacy, improving robustness, and supporting scalability. The authors propose a fully decentralized federated dataset dictionary learning framework, extending FedDaDiL by removing the central server. The method uses Wasserstein barycenters to explicitly model cross-domain distribution shifts and combines distributed optimization with domain-invariant feature alignment. Experiments show the approach compares favorably with its federated counterpart and with decentralized baselines on multi-source domain adaptation tasks, achieving stable convergence and low communication overhead while eliminating the single point of failure.

📝 Abstract
Decentralized Multi-Source Domain Adaptation (DMSDA) is a challenging task that aims to transfer knowledge from multiple related and heterogeneous source domains to an unlabeled target domain within a decentralized framework. Our work tackles DMSDA through a fully decentralized federated approach. In particular, we extend the Federated Dataset Dictionary Learning (FedDaDiL) framework by eliminating the necessity for a central server. FedDaDiL leverages Wasserstein barycenters to model the distributional shift across multiple clients, enabling effective adaptation while preserving data privacy. By decentralizing this framework, we enhance its robustness, scalability, and privacy, removing the risk of a single point of failure. We compare our method to its federated counterpart and other benchmark algorithms, showing that our approach effectively adapts source domains to an unlabeled target domain in a fully decentralized manner.
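To give a feel for how clients can aggregate without a central server, here is a minimal gossip-averaging sketch in plain Python. The ring topology, uniform mixing weights, and scalar parameters are illustrative assumptions, not the paper's actual protocol; the point is only that repeated neighbor averaging drives all nodes to the global mean with no coordinator.

```python
# Toy sketch of serverless (decentralized) aggregation via gossip averaging.
# Topology, weights, and step count are illustrative assumptions, not the
# protocol from the paper.

def gossip_average(params, neighbors, steps=50):
    """Iteratively mix each node's scalar parameter with its neighbors'."""
    values = list(params)
    for _ in range(steps):
        new_values = []
        for i, v in enumerate(values):
            nbrs = neighbors[i]
            # Uniform average over own value and neighbors' values.
            new_values.append((v + sum(values[j] for j in nbrs)) / (1 + len(nbrs)))
        values = new_values
    return values

# Four clients on a ring; every node converges toward the global mean (2.5)
# without any central server.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(gossip_average([1.0, 2.0, 3.0, 4.0], ring))
```

Because the ring is a regular graph, the uniform mixing matrix is doubly stochastic, so the consensus value is exactly the average of the initial parameters.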
Problem

Research questions and friction points this paper is trying to address.

Decentralized knowledge transfer from multiple heterogeneous sources
Privacy-preserving domain adaptation without central server
Robust federated learning for unlabeled target domains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decentralized federated learning without central server
Wasserstein barycenters for distributional shift modeling
Enhanced robustness, scalability, and privacy preservation
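As a hedged illustration of the barycenter idea listed above: in one dimension, the 2-Wasserstein barycenter of empirical distributions with equal sample sizes and uniform weights reduces to averaging their sorted samples (i.e., their quantile functions) pointwise. This is only a toy sketch of the concept; the paper works with labeled datasets and learned dictionary atoms, not raw 1D samples.

```python
# Toy 1D sketch: the 2-Wasserstein barycenter of equally sized 1D empirical
# distributions (uniform weights) is obtained by sorting each sample and
# averaging the sorted values pointwise. Illustrative only; not the paper's
# dataset dictionary learning procedure.

def wasserstein_barycenter_1d(samples):
    """Average the quantile functions of equally sized 1D samples."""
    sorted_samples = [sorted(s) for s in samples]
    n = len(sorted_samples[0])
    return [sum(s[i] for s in sorted_samples) / len(sorted_samples)
            for i in range(n)]

# Two shifted "domains": the barycenter sits halfway between them.
src = [0.0, 1.0, 2.0]
tgt = [4.0, 5.0, 6.0]
print(wasserstein_barycenter_1d([src, tgt]))  # → [2.0, 3.0, 4.0]
```

This midpoint behavior is what makes barycenters useful for modeling distribution shift: the barycenter interpolates between domains rather than mixing their supports.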
Rebecca Clain
CEA-List, Université Paris-Saclay, F-91120 Palaiseau, France
Eduardo Fernandes Montesuma
PhD, AI Researcher, SigmaNova
Optimal Transport · Transfer Learning
Fred Maurice Ngolè Mboula
CEA-List, Université Paris-Saclay, F-91120 Palaiseau, France