Improving Learning of New Diseases through Knowledge-Enhanced Initialization for Federated Adapter Tuning

📅 2025-08-13
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In medical federated learning (FL), clients must rapidly adapt to novel disease diagnosis tasks but lack mechanisms for cross-task and cross-client knowledge reuse. Method: We propose a knowledge-enhanced adapter initialization framework that integrates global clustering with a bilevel optimization strategy: an upper-level optimizer dynamically learns inter-cluster aggregation weights to model cross-client knowledge distributions, while a lower-level optimizer refines intra-cluster personalized weights to enable cross-task transfer; adapters are thus hierarchically initialized to accelerate cold-start adaptation. The framework supports multimodal medical data and enables efficient large-model adaptation in FL settings. Contribution/Results: Evaluated on three benchmark medical datasets—dermatology, chest X-ray, and retinal OCT—our method achieves significantly higher diagnostic accuracy for unseen diseases compared to existing FL adapter approaches, demonstrating superior generalizability and practical utility in real-world clinical federated scenarios.

Technology Category

Application Category

📝 Abstract
In healthcare, federated learning (FL) is a widely adopted framework that enables privacy-preserving collaboration among medical institutions. With large foundation models (FMs) demonstrating impressive capabilities, using FMs in FL through cost-efficient adapter tuning has become a popular approach. Given the rapidly evolving healthcare environment, it is crucial for individual clients to quickly adapt to new tasks or diseases by tuning adapters while drawing upon past experiences. In this work, we introduce Federated Knowledge-Enhanced Initialization (FedKEI), a novel framework that leverages cross-client and cross-task transfer from past knowledge to generate informed initializations for learning new tasks with adapters. FedKEI begins with a global clustering process at the server to generalize knowledge across tasks, followed by the optimization of aggregation weights across clusters (inter-cluster weights) and within each cluster (intra-cluster weights) to personalize knowledge transfer for each new task. To facilitate more effective learning of the inter- and intra-cluster weights, we adopt a bi-level optimization scheme that collaboratively learns the global intra-cluster weights across clients and optimizes the local inter-cluster weights toward each client's task objective. Extensive experiments on three benchmark datasets of different modalities, including dermatology, chest X-rays, and retinal OCT, demonstrate FedKEI's advantage in adapting to new diseases compared to state-of-the-art methods.
Problem

Research questions and friction points this paper is trying to address.

Enhancing new disease learning via knowledge-enhanced federated adapter tuning
Leveraging past knowledge for efficient initialization in federated learning
Optimizing cluster weights for personalized medical task adaptation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Knowledge-Enhanced Initialization for adapters
Bi-level optimization for cluster weights
Global clustering for cross-task knowledge
🔎 Similar Papers
No similar papers found.
Danni Peng
Danni Peng
A*STAR, Singapore
Continual LearningMeta LearningDomain GeneralizationFederated Learning
Y
Yuan Wang
Institute of High Performance Computing (IHPC), Agency for Science, Technology and Research (A*STAR), Singapore
K
Kangning Cai
EVYD Technology
P
Peiyan Ning
EVYD Technology
Jiming Xu
Jiming Xu
Ant Group
side-channel attacksprivacy-preserving computation
Y
Yong Liu
Institute of High Performance Computing (IHPC), Agency for Science, Technology and Research (A*STAR), Singapore
R
Rick Siow Mong Goh
Institute of High Performance Computing (IHPC), Agency for Science, Technology and Research (A*STAR), Singapore
Qingsong Wei
Qingsong Wei
Principal Scientist of Institute of High Performance Computing, A*STAR
Federated LearningPrivacy-preserving Machine LearningBlockchainDecentralized Computing
Huazhu Fu
Huazhu Fu
Principal Scientist, IHPC, A*STAR
Medical Image AnalysisAI for HealthcareMedical AITrustworthy AI