UNICON: UNIfied CONtinual Learning for Medical Foundational Models

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Medical imaging data scarcity hinders continual adaptation of foundation models across modalities, tasks, and anatomical regions, leading to catastrophic forgetting and task interference. To address this, we propose the first perpetually extensible continual learning framework for medical foundation models, enabling coordinated evolution across modalities (e.g., CT/PET), tasks (e.g., classification, segmentation, prognostic prediction), and anatomical domains. Our approach integrates progressive parameter updating, knowledge consolidation, and task-dependency modeling to preserve prior capabilities while incorporating new ones. Unlike conventional isolated fine-tuning paradigms, our framework seamlessly extends a pre-trained chest CT foundation model to support PET imaging and novel clinical tasks without retraining from scratch. Empirical evaluation shows a 5% absolute improvement in segmentation Dice score, significant mitigation of catastrophic forgetting, and robust support for dynamic, real-world clinical deployment.

📝 Abstract
Foundational models are trained on extensive datasets to capture the general trends of a domain. However, in medical imaging, the scarcity of data makes pre-training for every domain, modality, or task challenging. Continual learning offers a solution by fine-tuning a model sequentially on different domains or tasks, enabling it to integrate new knowledge without requiring large datasets for each training phase. In this paper, we propose UNIfied CONtinual Learning for Medical Foundational Models (UNICON), a framework that enables the seamless adaptation of foundation models to diverse domains, tasks, and modalities. Unlike conventional adaptation methods that treat these changes in isolation, UNICON provides a unified, perpetually expandable framework. Through careful integration, we show that foundation models can dynamically expand across imaging modalities, anatomical regions, and clinical objectives without catastrophic forgetting or task interference. Empirically, we validate our approach by adapting a chest CT foundation model, initially trained for classification, to prognosis and segmentation tasks. Our results show improved performance across both additional tasks. Furthermore, we continually incorporated PET scans and achieved a 5% improvement in Dice score compared to respective baselines. These findings establish that foundation models are not inherently constrained to their initial training scope but can evolve, paving the way toward generalist AI models for medical imaging.
Problem

Research questions and friction points this paper is trying to address.

Adapting medical foundation models to new domains and tasks
Overcoming data scarcity in medical imaging continual learning
Preventing catastrophic forgetting during sequential fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified continual learning framework for medical imaging
Dynamic expansion across modalities and tasks
Prevents catastrophic forgetting and task interference
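The paper does not detail its consolidation mechanism here, but a common way to prevent catastrophic forgetting during sequential fine-tuning is an elastic-weight-consolidation-style (EWC) quadratic penalty that anchors parameters important to earlier tasks. The sketch below is purely illustrative of that general idea, not UNICON's actual implementation; the function names, the toy Fisher values, and `lam` are assumptions for the example.

```python
import numpy as np

# Illustrative EWC-style consolidation sketch (Kirkpatrick et al. 2017),
# NOT the paper's method: parameters with high estimated importance
# ("fisher") to previous tasks are penalized for drifting.

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Quadratic penalty anchoring parameters to the old-task optimum."""
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

def consolidated_grad(grad_new_task, theta, theta_star, fisher, lam=1.0):
    """Gradient of the new-task loss plus the consolidation term."""
    return grad_new_task + lam * fisher * (theta - theta_star)

# Toy usage: the first parameter (high importance) resists change most.
theta_star = np.zeros(3)              # optimum after the previous task
fisher = np.array([10.0, 1.0, 0.0])   # hypothetical per-parameter importance
theta = np.array([1.0, 1.0, 1.0])     # current parameters on the new task
print(ewc_penalty(theta, theta_star, fisher, lam=2.0))  # 11.0
```

Under such a scheme, sequential fine-tuning on a new modality or task trades off the new-task gradient against drift on weights the old tasks depend on, which is one standard route to the forgetting mitigation the bullets describe.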