Efficient Self-Supervised Adaptation for Medical Image Analysis

📅 2025-03-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the high computational cost and GPU memory consumption of self-supervised adaptation (SSA) in medical image analysis, this paper proposes Efficient Self-Supervised Adaptation (ESSA), the first systematic exploration of parameter-efficient fine-tuning (PEFT) for SSA. Among the PEFT techniques evaluated, including LoRA, Attention Projection Layer Adaptation (APLA) performs best: a lightweight approach that integrates into self-supervised pretraining frameworks such as MAE and SimMIM. Experiments across diverse medical imaging tasks demonstrate that ESSA outperforms both full-parameter SSA and supervised fine-tuning in accuracy, while reducing GPU memory usage by up to 40.1% and increasing training throughput by 25.2%, without compromising inference accuracy or efficiency. This work establishes a practical path toward lightweight, deployable self-supervised transfer learning in medical imaging.
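The core idea behind APLA, as described above, is to adapt only a small slice of the attention mechanism while freezing the rest of the backbone. A minimal NumPy sketch of this style of parameter-efficient adaptation, assuming a standard ViT block layout (the `qkv`, `proj`, and MLP weight names are common conventions, not the paper's exact implementation):

```python
import numpy as np

# Hypothetical sketch of APLA-style adaptation: in each transformer block,
# only the attention output projection is marked trainable; the QKV and MLP
# weights stay frozen. Layer names and sizes are illustrative assumptions.

rng = np.random.default_rng(0)
dim, n_blocks = 768, 12  # ViT-Base-like dimensions

def make_block(dim):
    return {
        "qkv":  rng.standard_normal((dim, 3 * dim)),  # frozen
        "proj": rng.standard_normal((dim, dim)),      # trainable under APLA
        "mlp1": rng.standard_normal((dim, 4 * dim)),  # frozen
        "mlp2": rng.standard_normal((4 * dim, dim)),  # frozen
    }

blocks = [make_block(dim) for _ in range(n_blocks)]

# Only the projection weights would receive gradient updates.
trainable = sum(b["proj"].size for b in blocks)
total = sum(w.size for b in blocks for w in b.values())
print(f"trainable fraction of block weights: {trainable / total:.1%}")
```

With this toy layout, the projection layers account for roughly a twelfth of the block weights, which illustrates why freezing the rest cuts optimizer state and activation-gradient memory substantially.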

📝 Abstract
Self-supervised adaptation (SSA) improves foundation model transfer to medical domains but is computationally prohibitive. Although parameter-efficient fine-tuning methods such as LoRA have been explored for supervised adaptation, their effectiveness for SSA remains unknown. In this work, we introduce efficient self-supervised adaptation (ESSA), a framework that applies parameter-efficient fine-tuning techniques to SSA with the aim of reducing computational cost and improving adaptation performance. Among the methods tested, Attention Projection Layer Adaptation (APLA) sets a new state-of-the-art, consistently surpassing full-parameter SSA and supervised fine-tuning across diverse medical tasks, while reducing GPU memory by up to 40.1% and increasing training throughput by 25.2%, all while maintaining inference efficiency.
Problem

Research questions and friction points this paper is trying to address.

Reducing computational cost in self-supervised medical image adaptation
Evaluating parameter-efficient fine-tuning for self-supervised adaptation
Improving adaptation performance while maintaining inference efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-efficient fine-tuning for self-supervised adaptation
Attention Projection Layer Adaptation (APLA) method
Reduces GPU memory and increases training throughput
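For contrast with APLA, the LoRA baseline mentioned above learns a low-rank additive update to a frozen weight matrix. A minimal NumPy sketch of the standard LoRA formulation (an assumed illustration for context, not code from this paper):

```python
import numpy as np

# Standard LoRA: a frozen weight W is augmented with a trainable low-rank
# update (alpha / r) * B @ A, so only r * (d_in + d_out) parameters are
# learned instead of d_in * d_out. Dimensions here are illustrative.

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 768, 768, 8, 16

W = rng.standard_normal((d_in, d_out))      # frozen pretrained weight
A = rng.standard_normal((r, d_out)) * 0.01  # trainable down/up factors
B = np.zeros((d_in, r))                     # zero-init => update starts as a no-op

def lora_forward(x):
    # Effective weight is W + (alpha / r) * B @ A, applied without
    # materializing the merged matrix.
    return x @ W + (alpha / r) * (x @ B) @ A

x = rng.standard_normal((4, d_in))
assert np.allclose(lora_forward(x), x @ W)  # zero-init B leaves output unchanged

lora_params, full_params = A.size + B.size, W.size
print(f"LoRA params: {lora_params} vs full: {full_params} "
      f"({lora_params / full_params:.1%})")
```

The zero-initialized `B` factor means adaptation starts from the pretrained behavior exactly, which is part of why such methods pair well with self-supervised objectives.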
Moein Sorkhei
KTH Royal Institute of Technology, Stockholm, Sweden; Science for Life Laboratory, Stockholm, Sweden
Emir Konuk
KTH Royal Institute of Technology, Stockholm, Sweden; Science for Life Laboratory, Stockholm, Sweden
Jingyu Guo
Auransa
Genetics and Genomics; Translational research with model system (Yeast); Bioinformatics
Christos Matsoukas
AstraZeneca
Artificial Intelligence; Machine Learning; Computer Vision; Medical Image Analysis
Kevin Smith
KTH Royal Institute of Technology, Stockholm, Sweden; Science for Life Laboratory, Stockholm, Sweden