Atlas is Your Perfect Context: One-Shot Customization for Generalizable Foundational Medical Image Segmentation

📅 2025-12-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing interactive foundation models (e.g., nnInteractive) generalize poorly to rare clinical scenarios and rely heavily on labor-intensive, expert-crafted prompts, falling short of the high-precision demands of medical image segmentation. To address this, the authors propose a framework featuring an atlas-guided, context-aware prompt generation mechanism and a test-time dual-path prediction fusion adapter, enabling fine-tuning-free, one-shot model customization from a single labeled example. The method constructs an anatomical atlas via image registration to generate robust, anatomy-informed prompts, and integrates a lightweight adapter with multimodal foundation models for collaborative inference. Evaluated on multicenter, multimodal, multi-organ datasets, the approach significantly improves segmentation accuracy, especially for small structures, while maintaining low deployment overhead and compatibility with real-time clinical workflows. Key innovations include the first atlas-prompt joint modeling framework and a fine-tuning-free, test-time adaptive mechanism.

📝 Abstract
Accurate medical image segmentation is essential for clinical diagnosis and treatment planning. While recent interactive foundation models (e.g., nnInteractive) enhance generalization through large-scale multimodal pretraining, they still depend on precise prompts and often perform below expectations in contexts that are underrepresented in their training data. We present AtlasSegFM, an atlas-guided framework that customizes available foundation models to clinical contexts with a single annotated example. The core innovations are: 1) a pipeline that provides context-aware prompts for foundation models via registration between a context atlas and query images, and 2) a test-time adapter to fuse predictions from both atlas registration and the foundation model. Extensive experiments across public and in-house datasets spanning multiple modalities and organs demonstrate that AtlasSegFM consistently improves segmentation, particularly for small, delicate structures. AtlasSegFM provides a lightweight, deployable solution for one-shot customization of foundation models in real-world clinical workflows. The code will be made publicly available.
Problem

Research questions and friction points this paper is trying to address.

Interactive foundation models depend on precise, labor-intensive, expert-crafted prompts.
Generalization degrades in clinical contexts underrepresented in pretraining data.
Small, delicate anatomical structures are segmented poorly without task-specific adaptation or fine-tuning.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Atlas-guided framework for one-shot customization
Context-aware prompts via atlas-query image registration
Test-time adapter fusing atlas and foundation model predictions
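The two-path design above can be sketched in a few lines: a warped atlas label yields a spatial prompt for the foundation model, and the two resulting probability maps are fused at test time. This is a minimal illustrative sketch, not the paper's actual adapter; the function names, the bounding-box prompt, and the convex-combination fusion (with weight `alpha`, which would be tuned on the single labeled example) are all assumptions.

```python
import numpy as np

def box_prompt_from_atlas(warped_label: np.ndarray):
    """Derive a bounding-box prompt from an atlas label warped onto the
    query image (a simplified stand-in for context-aware prompt generation).
    Returns (min_corner, max_corner) voxel indices, or None if empty."""
    coords = np.argwhere(warped_label > 0)
    if coords.size == 0:
        return None
    return coords.min(axis=0), coords.max(axis=0)

def fuse_predictions(p_atlas: np.ndarray, p_fm: np.ndarray,
                     alpha: float = 0.5) -> np.ndarray:
    """Fuse atlas-registration and foundation-model probability maps.

    A voxel-wise convex combination stands in for the learned test-time
    adapter; `alpha` weights the atlas path against the foundation model.
    """
    assert p_atlas.shape == p_fm.shape
    fused = alpha * p_atlas + (1.0 - alpha) * p_fm
    return (fused >= 0.5).astype(np.uint8)  # binary segmentation mask
```

In use, the warped atlas label would first produce a box prompt fed to the foundation model; the model's probability map and the atlas's own (soft) label map are then combined by `fuse_predictions`, letting the atlas path compensate where the foundation model is unreliable (e.g., small structures).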