pFedDSH: Enabling Knowledge Transfer in Personalized Federated Learning through Data-free Sub-Hypernetwork

📅 2025-08-07
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In personalized federated learning with dynamically joining clients, existing methods struggle to simultaneously preserve performance stability for incumbent clients and enable rapid personalization for newly joined clients, and they lack mechanisms for cross-batch knowledge transfer. Method: We propose a data-free continual adaptation framework built on a central hypernetwork that generates client-specific subnetworks. To stabilize historical knowledge, we introduce batch-specific binary masks; to enable cross-batch knowledge transfer without raw data, we integrate DeepInversion-based synthetic data replay. Contribution/Results: This is the first approach to enable continual personalized adaptation in dynamic federated settings without requiring original client data. Experiments on CIFAR-10, CIFAR-100, and Tiny-ImageNet demonstrate significant improvements over state-of-the-art methods: better neural-resource efficiency, maintained accuracy for legacy clients, and faster convergence for new clients.

📝 Abstract
Federated Learning (FL) enables collaborative model training across distributed clients without sharing raw data, offering a significant privacy benefit. However, most existing Personalized Federated Learning (pFL) methods assume static client participation, which does not reflect real-world scenarios where new clients may continuously join the federated system (i.e., dynamic client onboarding). In this paper, we explore a practical scenario in which a new batch of clients is introduced incrementally while the learning task remains unchanged. This dynamic environment poses various challenges, including preserving performance for existing clients without retraining and enabling efficient knowledge transfer between client batches. To address these issues, we propose Personalized Federated Data-Free Sub-Hypernetwork (pFedDSH), a novel framework based on a central hypernetwork that generates personalized models for each client via embedding vectors. To maintain knowledge stability for existing clients, pFedDSH incorporates batch-specific masks, which activate subsets of neurons to preserve knowledge. Furthermore, we introduce a data-free replay strategy motivated by DeepInversion to facilitate backward transfer, enhancing existing clients' performance without compromising privacy. Extensive experiments conducted on CIFAR-10, CIFAR-100, and Tiny-ImageNet demonstrate that pFedDSH outperforms the state-of-the-art pFL and Federated Continual Learning baselines in our investigated scenario. Our approach achieves robust performance stability for existing clients, as well as adaptation for new clients and efficient utilization of neural resources.
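As a toy illustration of the data-free replay idea mentioned in the abstract, the sketch below optimizes a synthetic input against a frozen linear classifier so that it is confidently assigned a chosen class. This is a drastically simplified stand-in for DeepInversion, which additionally matches batch-normalization statistics of a deep network; all dimensions, names, and hyperparameters here are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen "global model": a single linear classifier, standing in for the
# trained network that DeepInversion-style synthesis would invert.
W = rng.normal(0, 0.5, (32, 10))  # 32 input features, 10 classes

def synthesize(target_class, steps=200, lr=0.1):
    """Gradient ascent on the target-class logit w.r.t. the input,
    starting from noise: recover a class-representative input from a
    frozen model, with no access to any real training data."""
    x = rng.normal(0, 0.1, 32)
    for _ in range(steps):
        # d(logit_c)/dx for a linear model is just the class column of W,
        # with a small L2 penalty keeping the synthetic input bounded.
        grad = W[:, target_class] - 0.01 * x
        x += lr * grad
    return x

x_syn = synthesize(target_class=3)
logits = x_syn @ W
print(int(np.argmax(logits)))  # expected: 3 (the target class)
```

Replaying such synthetic samples is what lets the server transfer knowledge backward to earlier client batches without ever seeing raw client data.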
Problem

Research questions and friction points this paper is trying to address.

Dynamic client onboarding in federated learning systems
Preserving performance for existing clients without retraining
Efficient knowledge transfer between client batches
Innovation

Methods, ideas, or system contributions that make the work stand out.

Data-free sub-hypernetwork for personalized federated learning
Batch-specific masks to preserve client knowledge
Data-free replay strategy for backward transfer
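The first two ideas above can be sketched as follows: a small hypernetwork maps a client embedding to the weights of that client's personalized layer, and a batch-specific binary mask restricts each onboarding batch to its own subset of those weights, so freezing earlier batches' subsets preserves their knowledge. All shapes and names here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only)
EMBED_DIM = 8                  # client embedding size
HIDDEN = 32                    # hypernetwork hidden width
TARGET_IN, TARGET_OUT = 16, 4  # shape of the client's personalized layer
N_PARAMS = TARGET_IN * TARGET_OUT

# Hypernetwork parameters: a small MLP mapping a client embedding to the
# flattened weights of that client's personalized layer.
W1 = rng.normal(0, 0.1, (EMBED_DIM, HIDDEN))
W2 = rng.normal(0, 0.1, (HIDDEN, N_PARAMS))

def hypernet(client_embedding):
    h = np.tanh(client_embedding @ W1)
    return (h @ W2).reshape(TARGET_IN, TARGET_OUT)

def batch_mask(batch_id, n_batches=4):
    """Batch-specific binary mask: each onboarding batch gets a fixed,
    disjoint subset of the generated weights to train."""
    mask = np.zeros(N_PARAMS, dtype=bool)
    per_batch = N_PARAMS // n_batches
    mask[batch_id * per_batch:(batch_id + 1) * per_batch] = True
    return mask.reshape(TARGET_IN, TARGET_OUT)

client_emb = rng.normal(size=EMBED_DIM)
weights = hypernet(client_emb)          # personalized weights for this client
masked = weights * batch_mask(batch_id=1)  # only batch 1's subset is active
print(masked.shape)  # (16, 4)
```

In this toy version, updating only the entries selected by a batch's mask leaves the weights assigned to earlier batches untouched, which is the mechanism the paper relies on for performance stability of existing clients.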
Thinh Nguyen
VinUni-Illinois Smart Health Center, VinUniversity, Hanoi, Vietnam; College of Engineering & Computer Science, VinUniversity, Hanoi, Vietnam
Le Huy Khiem
University of Notre Dame, Indiana, USA
Van-Tuan Tran
Technische Universität Berlin, Germany
Khoa D Doan
VinUniversity
Nitesh V Chawla
University of Notre Dame, Indiana, USA
Kok-Seng Wong
VinUni-Illinois Smart Health Center, VinUniversity, Hanoi, Vietnam; College of Engineering & Computer Science, VinUniversity, Hanoi, Vietnam