CellStyle: Improved Zero-Shot Cell Segmentation via Style Transfer

๐Ÿ“… 2025-03-11
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Cell segmentation in microscopy images suffers from severe domain shifts across imaging devices, staining protocols, and cell types, compounded by scarce annotated dataโ€”resulting in poor zero-shot generalization of existing models. To address this, we propose the first shape-preserving unsupervised style transfer framework tailored for cell segmentation. Our method decouples texture (e.g., stain appearance, noise, color) from shape via semantic mask guidance, transferring target-domain style onto labeled source images while strictly preserving cellular morphology. The resulting stylized synthetic images are directly usable for fine-tuning without requiring target-domain annotations or architectural modifications to downstream segmenters (e.g., U-Net, SAM). Evaluated on six heterogeneous datasets, our approach achieves an average 18.7% Dice improvement over state-of-the-art zero-shot methods; after fine-tuning, performance reaches 75% of fully supervised baselines.

Technology Category

Application Category

๐Ÿ“ Abstract
Cell microscopy data are abundant; however, corresponding segmentation annotations remain scarce. Moreover, variations in cell types, imaging devices, and staining techniques introduce significant domain gaps between datasets. As a result, even large, pretrained segmentation models trained on diverse datasets (source datasets) struggle to generalize to unseen datasets (target datasets). To overcome this generalization problem, we propose CellStyle, which improves the segmentation quality of such models without requiring labels for the target dataset, thereby enabling zero-shot adaptation. CellStyle transfers the attributes of an unannotated target dataset, such as texture, color, and noise, to the annotated source dataset. This transfer is performed while preserving the cell shapes of the source images, ensuring that the existing source annotations can still be used while maintaining the visual characteristics of the target dataset. The styled synthetic images with the existing annotations enable the finetuning of a generalist segmentation model for application to the unannotated target data. We demonstrate that CellStyle significantly improves zero-shot cell segmentation performance across diverse datasets by finetuning multiple segmentation models on the style-transferred data. The code will be made publicly available.
Problem

Research questions and friction points this paper is trying to address.

Addresses generalization issues in cell segmentation models
Enables zero-shot adaptation without target dataset labels
Improves segmentation by transferring target dataset attributes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Style transfer for zero-shot cell segmentation
Preserves cell shapes while adapting visual attributes
Finetunes models using style-transferred synthetic images
๐Ÿ”Ž Similar Papers
No similar papers found.
R
Ruveyda Yilmaz
Institute of Imaging and Computer Vision, RWTH Aachen University, Germany
Z
Zhu Chen
Institute of Imaging and Computer Vision, RWTH Aachen University, Germany
Yuli Wu
Yuli Wu
RWTH Aachen University
Computer VisionRetinal Prosthesis
Johannes Stegmaier
Johannes Stegmaier
RWTH Aachen University
3D+t Image AnalysisMachine LearningMicroscopyDevelopmental BiologyMedical Image Analysis