🤖 AI Summary
Existing cell segmentation methods either rely on manual parameter tuning and strong shape assumptions or require large amounts of annotated data, limiting generalizability to novel cell types, noisy images, and those with severe intensity inhomogeneity. This paper introduces the first unsupervised cell segmentation framework based on fast Gaussian processes—requiring no labels, zero parameter tuning, and making no prior assumptions about cell morphology. Our key contributions are: (1) modeling the joint spatial-intensity distribution of pixels via a scalable Gaussian process; (2) designing an adaptive, robust thresholding criterion that suppresses noise while preserving weak boundaries; and (3) integrating an optimized watershed algorithm for accurate separation of touching cells. Extensive evaluation on synthetic benchmarks and large-scale real microscopy datasets demonstrates that our method significantly outperforms state-of-the-art supervised and traditional unsupervised approaches in segmentation accuracy, robustness to noise and intensity variation, and computational scalability.
📝 Abstract
Cell boundary information is crucial for analyzing cell behaviors from time-lapse microscopy videos. Existing supervised cell segmentation tools, such as ImageJ, require tuning various parameters and rely on restrictive assumptions about the shape of the objects. While recent supervised segmentation tools based on convolutional neural networks enhance accuracy, they depend on high-quality labelled images, making them unsuitable for segmenting new types of objects not in the database. We developed a novel unsupervised cell segmentation algorithm based on fast Gaussian processes for noisy microscopy images without the need for parameter tuning or restrictive assumptions about the shape of the object. We derived robust thresholding criteria adaptive for heterogeneous images containing distinct brightness at different parts to separate objects from the background, and employed watershed segmentation to distinguish touching cell objects. Both simulated studies and real-data analysis of large microscopy images demonstrate the scalability and accuracy of our approach compared with the alternatives.