🤖 AI Summary
Intraoperative margin assessment following breast-conserving surgery remains inaccurate: conventional two-dimensional specimen radiography (SR) lacks sufficient precision, necessitating secondary surgeries in roughly 25% of patients. To address this, we propose a novel framework integrating the Segment Anything Model (SAM) with forward-forward contrastive learning: using ResNet-18 as the backbone, local-global contrastive pretraining enhances patch-level discriminative capability under limited data, generating coarse masks that guide SAM toward fine-grained margin segmentation. Our method achieves an AUC of 0.8455 on margin classification and improves segmentation Dice score by 27.4% over baselines, with an inference time of only 47 ms per image. This work introduces forward-forward contrastive learning to medical margin analysis for the first time, achieving a favorable trade-off among accuracy, computational efficiency, and robustness in low-data regimes.
📝 Abstract
Complete removal of cancerous tumors with a negative specimen margin during lumpectomy is essential to reducing breast cancer recurrence. However, 2D specimen radiography (SR), the current method used to assess intraoperative specimen margin status, has limited accuracy, resulting in nearly a quarter of patients requiring additional surgery. To address this, we propose a novel deep learning framework combining the Segment Anything Model (SAM) with Forward-Forward Contrastive Learning (FFCL), a pre-training strategy leveraging both local and global contrastive learning for patch-level classification of SR images. After annotating SR images with regions of known malignancy, non-malignant tissue, and pathology-confirmed margins, we pre-train a ResNet-18 backbone with FFCL to classify margin status, then reconstruct coarse binary masks to prompt SAM for refined tumor margin segmentation. Our approach achieved an AUC of 0.8455 for margin classification and segmented margins with a 27.4% improvement in Dice similarity over baseline models, while reducing inference time to 47 milliseconds per image. These results demonstrate that FFCL-SAM significantly enhances both the speed and accuracy of intraoperative margin assessment, with strong potential to reduce re-excision rates and improve surgical outcomes in breast cancer treatment. Our code is available at https://github.com/tbwa233/FFCL-SAM/.
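The step of reconstructing coarse binary masks from patch-level predictions can be sketched as follows. This is a minimal illustration, not the repository's actual implementation: the grid size, patch size, and 0.5 threshold are assumptions for demonstration, and in the full pipeline the resulting mask would be passed to SAM as a prompt.

```python
# Hypothetical sketch: turn a grid of patch-level malignancy probabilities
# (e.g., from an FFCL-pretrained ResNet-18 classifier) into a coarse binary
# mask at image resolution, suitable for prompting SAM.
import numpy as np

def coarse_mask_from_patches(patch_probs, image_shape, patch_size, thresh=0.5):
    """Tile each patch's prediction back into image space.

    patch_probs : 2D array of per-patch malignancy probabilities
    image_shape : (H, W) of the original SR image
    patch_size  : side length of each square patch (assumption)
    thresh      : decision threshold (assumption; 0.5 for illustration)
    """
    rows, cols = patch_probs.shape
    mask = np.zeros(image_shape, dtype=np.uint8)
    for r in range(rows):
        for c in range(cols):
            if patch_probs[r, c] >= thresh:
                mask[r * patch_size:(r + 1) * patch_size,
                     c * patch_size:(c + 1) * patch_size] = 1
    return mask

# Toy example: a 4x4 patch grid over a 256x256 image, one "malignant" patch.
probs = np.zeros((4, 4))
probs[1, 2] = 0.9
mask = coarse_mask_from_patches(probs, (256, 256), 64)
# mask is 1 exactly over the 64x64 region of the positive patch.
```

In the full FFCL-SAM pipeline, this coarse mask (or a prompt derived from it, such as a bounding box or point set) would condition SAM to produce the refined margin segmentation.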