🤖 AI Summary
This work addresses a critical limitation in existing semi-supervised medical image segmentation methods, which often employ rectangular region shifts that ignore anatomical structures, leading to boundary distortions and semantic inconsistencies. To overcome this, the authors propose the UCAD framework, which uniquely integrates contour-aware superpixel shifting with an uncertainty-guided mechanism. Specifically, superpixels are leveraged to generate regions that conform to anatomical boundaries, while uncertainty estimates dynamically identify challenging regions for targeted augmentation. Additionally, a dynamically weighted consistency loss is introduced to enhance model generalization on unlabeled data. Experimental results demonstrate that UCAD significantly outperforms state-of-the-art methods under limited annotation settings, achieving superior segmentation accuracy and improved boundary fidelity.
📝 Abstract
Existing displacement strategies in semi-supervised segmentation operate only on rectangular regions, ignoring anatomical structures and resulting in boundary distortions and semantic inconsistency. To address these issues, we propose UCAD, an Uncertainty-Guided Contour-Aware Displacement framework for semi-supervised medical image segmentation that preserves contour-aware semantics while enhancing consistency learning. UCAD leverages superpixels to generate coherent regions aligned with anatomical boundaries, and an uncertainty-guided selection mechanism to selectively displace challenging regions for better consistency learning. We further propose a dynamic uncertainty-weighted consistency loss, which adaptively stabilizes training and effectively regularizes the model on unlabeled regions. Extensive experiments demonstrate that UCAD consistently outperforms state-of-the-art semi-supervised segmentation methods, achieving superior segmentation accuracy under limited annotation. The code is available at: https://github.com/dcb937/UCAD.
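The abstract's pipeline (entropy-based uncertainty, selection of the most uncertain superpixels for displacement, and an uncertainty-down-weighted consistency loss) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names are hypothetical, it assumes both images share one superpixel map for simplicity, and it uses predictive entropy and an `exp(-u)` weight as plausible stand-ins for the paper's exact uncertainty estimate and weighting schedule.

```python
import numpy as np

def pixel_uncertainty(prob):
    """Per-pixel predictive entropy of a softmax map of shape (C, H, W)."""
    eps = 1e-8
    return -np.sum(prob * np.log(prob + eps), axis=0)  # (H, W)

def select_uncertain_superpixels(uncertainty, superpixels, k):
    """Ids of the k superpixels with the highest mean uncertainty."""
    ids = np.unique(superpixels)
    means = np.array([uncertainty[superpixels == i].mean() for i in ids])
    return ids[np.argsort(means)[::-1][:k]]

def displace_superpixels(img_a, img_b, superpixels, chosen):
    """Swap the chosen superpixel regions between two images.

    Because the swapped regions follow superpixel (contour-aware)
    boundaries rather than rectangles, anatomical edges are preserved.
    """
    out_a, out_b = img_a.copy(), img_b.copy()
    mask = np.isin(superpixels, chosen)
    out_a[mask] = img_b[mask]
    out_b[mask] = img_a[mask]
    return out_a, out_b

def weighted_consistency_loss(student_prob, teacher_prob, uncertainty):
    """Pixelwise MSE consistency, down-weighted where uncertainty is high."""
    w = np.exp(-uncertainty)                                   # (H, W)
    sq = np.mean((student_prob - teacher_prob) ** 2, axis=0)   # (H, W)
    return float(np.sum(w * sq) / (np.sum(w) + 1e-8))
```

In practice the superpixel map would come from an algorithm such as SLIC, and the uncertainty from a teacher model or Monte Carlo dropout; the weighting term would additionally be scheduled over training, as the "dynamic" qualifier in the abstract suggests.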