🤖 AI Summary
Existing voxel-wise uncertainty quantification metrics neglect spatial context and anatomical boundaries, which limits their ability to distinguish diffuse from boundary-aligned uncertainty patterns and undermines clinical interpretability. To address this, we propose the first uncertainty evaluation framework that explicitly integrates 3D spatial structure and anatomical boundary information. Our method combines voxel neighborhood statistics, directed boundary distance field modeling, and region-wise connectivity analysis to characterize the spatial distribution of segmentation uncertainty. Evaluated on the MSD prostate dataset, the proposed metrics correlate significantly better with clinically relevant factors such as contour reliability and regional consistency, improving Pearson correlation coefficients by at least 0.23. Moreover, our framework accurately identifies clinically meaningful uncertainty patterns. This work establishes a novel, interpretable, and structure-aware paradigm for assessing trustworthiness in medical image segmentation.
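To make the boundary-distance idea concrete, the sketch below computes a boundary-alignment score: the fraction of total uncertainty mass lying within a narrow band around the segmentation boundary, derived from a signed Euclidean distance field. This is a minimal illustration assuming NumPy/SciPy; the function names, the signed (rather than directed) distance field, and the band width `tau` are my assumptions, not the paper's actual metric definitions.

```python
# Illustrative sketch (not the paper's exact metric): how much of the
# uncertainty mass lies near the anatomical boundary of a segmentation?
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_boundary_distance(mask: np.ndarray) -> np.ndarray:
    """Signed Euclidean distance to the mask boundary (negative inside)."""
    mask = mask.astype(bool)
    return distance_transform_edt(~mask) - distance_transform_edt(mask)

def boundary_alignment_score(uncertainty: np.ndarray, mask: np.ndarray,
                             tau: float = 2.0) -> float:
    """Fraction of total uncertainty mass within `tau` voxels of the boundary."""
    near = np.abs(signed_boundary_distance(mask)) <= tau
    total = uncertainty.sum()
    return float(uncertainty[near].sum() / total) if total > 0 else 0.0

# Toy 3D example: a cubic "organ" in a 16^3 volume.
mask = np.zeros((16, 16, 16), dtype=bool)
mask[4:12, 4:12, 4:12] = True

# Boundary-aligned uncertainty scores high; uniform scatter scores low,
# even though a voxel-wise average could rate the two maps identically.
boundary_u = (np.abs(signed_boundary_distance(mask)) <= 1).astype(float)
scattered_u = np.ones_like(boundary_u)
```

A purely voxel-wise metric sees only the per-voxel values here; the distance field is what lets the score react to *where* the uncertainty sits relative to the anatomy.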
📝 Abstract
Uncertainty maps highlight unreliable regions in segmentation predictions. However, most uncertainty evaluation metrics treat voxels independently, ignoring spatial context and anatomical structure. As a result, they may assign identical scores to qualitatively distinct patterns (e.g., scattered vs. boundary-aligned uncertainty). We propose three spatially aware metrics that incorporate structural and boundary information and conduct a thorough validation on medical imaging data from the prostate zonal segmentation challenge within the Medical Segmentation Decathlon. Our results demonstrate improved alignment with clinically important factors and better discrimination between meaningful and spurious uncertainty patterns.
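The scattered-vs-coherent distinction mentioned above can be sketched with connected-component analysis of a thresholded uncertainty map; again this is an illustrative stand-in assuming SciPy, with a hypothetical helper and threshold, not the metric defined in the paper.

```python
# Illustrative sketch (assumed, not the paper's definition): connected-
# component analysis separates scattered from spatially coherent
# uncertainty, which a purely voxel-wise score cannot do.
import numpy as np
from scipy.ndimage import label

def uncertainty_components(uncertainty: np.ndarray, threshold: float = 0.5):
    """Label connected high-uncertainty regions; return count and region sizes."""
    labeled, n = label(uncertainty > threshold)  # default 6-connectivity in 3D
    sizes = np.bincount(labeled.ravel())[1:]     # drop the background label 0
    return n, sizes

# Two toy maps: isolated high-uncertainty voxels vs one coherent blob.
scattered = np.zeros((8, 8, 8))
for p in [(1, 1, 1), (4, 4, 4), (6, 1, 6)]:
    scattered[p] = 1.0

coherent = np.zeros((8, 8, 8))
coherent[2:5, 2:5, 2:5] = 1.0  # a single 3x3x3 region
```

Many small components suggest spurious, noise-like uncertainty, while a few large components suggest a structurally meaningful unreliable region, which is exactly the kind of pattern a voxel-independent score conflates.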