Detection of Breast Cancer Lumpectomy Margin with SAM-incorporated Forward-Forward Contrastive Learning

📅 2025-06-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Intraoperative margin assessment following breast-conserving surgery remains inaccurate, necessitating secondary surgeries in ~25% of patients; conventional two-dimensional specimen radiography (SR) lacks sufficient precision. To address this, we propose a novel framework integrating the Segment Anything Model (SAM) with forward-forward contrastive learning: using ResNet-18 as the backbone, local-global contrastive pretraining enhances patch-level discriminative capability under limited data, generating coarse masks to guide SAM for fine-grained margin segmentation. Our method achieves an AUC of 0.8455 on margin classification and improves segmentation Dice score by 27.4% over baselines, with inference time of only 47 ms per image. This work introduces forward-forward contrastive learning to medical margin analysis for the first time, achieving a favorable trade-off among high accuracy, computational efficiency, and robustness in low-data regimes.
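The local-global contrastive pretraining described above can be illustrated with a generic InfoNCE-style objective. The sketch below is a minimal numpy illustration, not the paper's implementation: FFCL trains with the Forward-Forward algorithm (layer-local passes rather than backpropagation), so this shows only the contrastive component, and the function name `info_nce` and its arguments are hypothetical.

```python
import numpy as np

def info_nce(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE contrastive loss for a single anchor embedding:
    pull the positive embedding close, push negatives away, using
    cosine similarity. Illustrative only -- not the paper's FFCL."""
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Positive pair goes first; negatives follow.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()  # numerical stability before exponentiating
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with positive at index 0

# Toy embeddings: anchor nearly identical to its positive.
anchor = np.array([1.0, 0.0])
positive = np.array([1.0, 0.01])
negatives = [np.array([0.0, 1.0]), np.array([-1.0, 0.0])]
print(info_nce(anchor, positive, negatives))  # small loss: positive dominates
```

In local-global pretraining of this kind, "local" positives would be augmented views of the same SR patch and "global" positives would pair patch embeddings with whole-image embeddings; the loss itself is the same shape.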

📝 Abstract
Complete removal of cancer tumors with a negative specimen margin during lumpectomy is essential in reducing breast cancer recurrence. However, 2D specimen radiography (SR), the current method used to assess intraoperative specimen margin status, has limited accuracy, resulting in nearly a quarter of patients requiring additional surgery. To address this, we propose a novel deep learning framework combining the Segment Anything Model (SAM) with Forward-Forward Contrastive Learning (FFCL), a pre-training strategy leveraging both local and global contrastive learning for patch-level classification of SR images. After annotating SR images with regions of known malignancy, non-malignant tissue, and pathology-confirmed margins, we pre-train a ResNet-18 backbone with FFCL to classify margin status, then reconstruct coarse binary masks to prompt SAM for refined tumor margin segmentation. Our approach achieved an AUC of 0.8455 for margin classification and segmented margins with a 27.4% improvement in Dice similarity over baseline models, while reducing inference time to 47 milliseconds per image. These results demonstrate that FFCL-SAM significantly enhances both the speed and accuracy of intraoperative margin assessment, with strong potential to reduce re-excision rates and improve surgical outcomes in breast cancer treatment. Our code is available at https://github.com/tbwa233/FFCL-SAM/.
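As a concrete illustration of the coarse-mask step in the pipeline above, the sketch below expands patch-level malignancy predictions into a pixel-level binary mask and derives a bounding-box prompt of the kind SAM accepts. This is a minimal numpy sketch under stated assumptions: the function names are hypothetical, and the paper's actual mask reconstruction and SAM prompting strategy (SAM can also take low-resolution mask inputs directly) may differ.

```python
import numpy as np

def coarse_mask_from_patches(patch_probs, patch_size, threshold=0.5):
    """Upsample a grid of patch-level malignancy probabilities into a
    pixel-level coarse binary mask via nearest-neighbour expansion."""
    binary = (patch_probs >= threshold).astype(np.uint8)
    # Kronecker product replicates each patch decision over its pixels.
    return np.kron(binary, np.ones((patch_size, patch_size), dtype=np.uint8))

def box_prompt_from_mask(mask):
    """Derive an (x0, y0, x1, y1) bounding-box prompt from a coarse
    binary mask; returns None if no pixel was flagged malignant."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1)

# Example: a 4x4 grid of patch predictions over 32-pixel patches,
# with the central 2x2 block predicted malignant.
probs = np.zeros((4, 4))
probs[1:3, 1:3] = 0.9
mask = coarse_mask_from_patches(probs, patch_size=32)
print(mask.shape)                  # (128, 128)
print(box_prompt_from_mask(mask))  # (32, 32, 96, 96)
```

The resulting box (or the mask itself) would then be passed to SAM as a prompt for fine-grained margin segmentation.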
Problem

Research questions and friction points this paper is trying to address.

Improving accuracy of breast cancer lumpectomy margin detection
Reducing re-excision rates with faster intraoperative assessment
Combining SAM and FFCL for enhanced tumor margin segmentation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines SAM with Forward-Forward Contrastive Learning
Uses FFCL for patch-level SR image classification
Reconstructs binary masks to refine SAM segmentation
Tyler Ward
Student Teaching Assistant, University of Kentucky
Computer vision, Machine learning, Medical imaging, Quality engineering
Xiaoqin Wang
University of Kentucky, USA
Braxton McFarland
University of Kentucky, USA
Md Atik Ahamed
University of Kentucky
Machine learning
Sahar Nozad
University of Kentucky, USA
Talal Arshad
University of Kentucky, USA
Hafsa Nebbache
University of Kentucky, USA
Jin Chen
The University of Alabama at Birmingham, USA
Abdullah Imran
University of Kentucky, USA