Federated Domain Generalization with Domain-specific Soft Prompts Generation

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing federated domain generalization (FDG) methods suffer from insufficient diversity in learned soft prompts and limited capability to model unseen domain knowledge, constraining cross-domain generalization. To address this, the paper proposes FedDSPG, a generative FDG framework built around domain-specific soft prompts (DSPs). During training, each client learns DSPs that integrate content semantics and domain priors into a shared generative model; at inference, the generator produces DSPs for unseen target domains, mitigating prompt homogenization and guiding downstream tasks in unknown domains. Comprehensive evaluations on several public benchmarks show that FedDSPG outperforms strong FDG baselines, achieving state-of-the-art results.

📝 Abstract
Prompt learning has become an efficient paradigm for adapting CLIP to downstream tasks. Compared with traditional fine-tuning, prompt learning optimizes only a few parameters yet yields highly competitive results, making it especially appealing in federated learning for its computational efficiency. However, the heterogeneous data distributions across clients engender domain shift, posing a formidable challenge for downstream-task adaptation. Existing federated domain generalization (FDG) methods based on prompt learning typically learn soft prompts from training samples, replacing manually designed prompts to enhance the generalization ability of federated models. However, these learned prompts exhibit limited diversity and tend to ignore information from unknown domains. We propose a novel and effective method from a generative perspective for handling FDG tasks, namely federated domain generalization with domain-specific soft prompts generation (FedDSPG). Specifically, during training, we introduce domain-specific soft prompts (DSPs) for each domain and integrate content and domain knowledge into the generative model among clients. In the inference phase, the generator is utilized to obtain DSPs for unseen target domains, thus guiding downstream tasks in unknown domains. Comprehensive evaluations across several public datasets confirm that our method outperforms existing strong baselines in FDG, achieving state-of-the-art results.
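The abstract describes a generator that combines content features with domain priors to emit soft prompts, and that synthesizes prompts for unseen domains at inference. The sketch below illustrates that idea in PyTorch; the class name `DSPGenerator`, the architecture, and the mean-of-priors strategy for unseen domains are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DSPGenerator(nn.Module):
    """Hypothetical domain-specific soft prompt (DSP) generator.

    Maps a content feature plus a domain prior to a CLIP-style soft
    prompt of shape (prompt_len, embed_dim).
    """

    def __init__(self, num_domains: int, content_dim: int = 512,
                 prompt_len: int = 16, embed_dim: int = 512):
        super().__init__()
        # One learned prior vector per known (client) domain.
        self.domain_embed = nn.Embedding(num_domains, content_dim)
        # Maps [content feature ; domain prior] to a flat soft prompt.
        self.net = nn.Sequential(
            nn.Linear(content_dim * 2, 1024),
            nn.ReLU(),
            nn.Linear(1024, prompt_len * embed_dim),
        )
        self.prompt_len, self.embed_dim = prompt_len, embed_dim

    def forward(self, content_feat: torch.Tensor,
                domain_prior: torch.Tensor) -> torch.Tensor:
        z = torch.cat([content_feat, domain_prior], dim=-1)
        return self.net(z).view(-1, self.prompt_len, self.embed_dim)

gen = DSPGenerator(num_domains=3)
feat = torch.randn(4, 512)  # stand-in for CLIP image features

# Training-time: prompts conditioned on each sample's known domain.
known = gen(feat, gen.domain_embed(torch.tensor([0, 1, 2, 0])))

# Inference-time, unseen domain: one simple choice (an assumption) is
# the mean of the learned domain priors.
unseen_prior = gen.domain_embed.weight.mean(0, keepdim=True).expand(4, -1)
unseen = gen(feat, unseen_prior)
print(known.shape, unseen.shape)  # torch.Size([4, 16, 512]) twice
```

The generated prompt tokens would then be prepended to CLIP's text (or image) token sequence in place of hand-crafted prompts; in a federated setting, the generator's weights, not raw data, would be what clients share and aggregate.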
Problem

Research questions and friction points this paper is trying to address.

Addresses domain shift challenges in federated learning
Generates domain-specific soft prompts for unseen domains
Enhances generalization ability across diverse client data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates domain-specific soft prompts
Integrates content and domain knowledge
Uses generator for unseen domains