FACT: foundation model for assessing cancer tissue margins with mass spectrometry.

📅 2025-04-04
🏛️ International Journal of Computer Assisted Radiology and Surgery
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address insufficient model generalization in intraoperative cancer tissue margin assessment, caused by the scarcity of labeled data, this work introduces the first foundation model specifically designed for Rapid Evaporative Ionization Mass Spectrometry (REIMS) data. Methodologically, the authors propose a supervised contrastive pretraining strategy based on triplet loss and, for the first time, adapt a cross-modal foundation model architecture, originally developed for text-audio tasks, to mass spectrometry signal modeling. The resulting model substantially improves discriminative performance in few-shot settings, achieving 82.4% ± 0.8 AUROC on real surgical REIMS data. This outperforms existing self-supervised and semi-supervised baselines, establishing a new state of the art for this task. The framework enables accurate, real-time intraoperative assessment of residual tumor tissue and offers a deployable paradigm for clinical translation.

📝 Abstract
PURPOSE: Accurately classifying tissue margins during cancer surgeries is crucial for ensuring complete tumor removal. Rapid Evaporative Ionization Mass Spectrometry (REIMS), a tool for real-time intraoperative margin assessment, generates spectra that require machine learning models to support clinical decision-making. However, the scarcity of labeled data in surgical contexts presents a significant challenge. This study is the first to develop a foundation model tailored specifically for REIMS data, addressing this limitation and advancing real-time surgical margin assessment.
METHODS: We propose FACT, a Foundation model for Assessing Cancer Tissue margins. FACT is an adaptation of a foundation model originally designed for text-audio association, pretrained using our proposed supervised contrastive approach based on triplet loss. An ablation study is performed to compare our proposed model against other models and pretraining methods.
RESULTS: Our proposed model significantly improves classification performance, achieving state-of-the-art results with an AUROC of 82.4% ± 0.8. The results demonstrate the advantage of our proposed pretraining method and selected backbone over the self-supervised and semi-supervised baselines and alternative models.
CONCLUSION: Our findings demonstrate that foundation models, adapted and pretrained using our novel approach, can effectively classify REIMS data even with limited labeled examples. This highlights the viability of foundation models for enhancing real-time surgical margin assessment, particularly in data-scarce clinical environments.
Problem

Research questions and friction points this paper is trying to address.

Accurately classifying cancer tissue margins during surgery
Overcoming labeled data scarcity in REIMS-based margin assessment
Developing a foundation model for real-time surgical decisions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adapts foundation model for REIMS data analysis
Uses supervised contrastive pretraining with triplet loss
Achieves state-of-the-art classification with 82.4% AUROC
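The supervised contrastive pretraining highlighted above centers on a triplet objective. As a minimal sketch (not the authors' implementation; the 2-D embeddings, margin value, and function names here are hypothetical stand-ins for embeddings produced by the pretrained backbone), the loss for one (anchor, positive, negative) triplet can be computed as:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet margin loss on embedded spectra: pull the anchor toward a
    same-class (positive) embedding and push it at least `margin` farther
    from a different-class (negative) embedding."""
    d_pos = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_neg = np.linalg.norm(anchor - negative)  # anchor-negative distance
    return max(d_pos - d_neg + margin, 0.0)

# Toy example with 2-D embeddings (real REIMS embeddings would be
# produced by the model backbone and have far higher dimension).
anchor   = np.array([0.0, 0.0])
positive = np.array([0.2, 0.0])   # same tissue class, nearby
negative = np.array([3.0, 0.0])   # different tissue class, far away
loss = triplet_loss(anchor, positive, negative)  # well separated: loss is 0.0
```

Minimizing this quantity over many labeled triplets clusters same-class spectra in embedding space, which is what makes the downstream few-shot classifier effective despite scarce labels.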
Mohammad Farahmand
Queen's University, Kingston, ON, Canada
A. Jamzad
Queen's University, Kingston, ON, Canada
Fahimeh Fooladgar
The University of British Columbia, Vancouver, BC, Canada
Laura Connolly
Malone Postdoctoral Fellow, Johns Hopkins University (electrical engineering, robotics, surgical navigation)
M. Kaufmann
Queen's University, Kingston, ON, Canada
Kevin Yi Mi Ren
Queen's University, Kingston, ON, Canada
J. Rudan
Queen's University, Kingston, ON, Canada
D. McKay
Queen's University, Kingston, ON, Canada
Gabor Fichtinger
Professor and Canada Research Chair in Computer-Assisted Surgery, Queen's University, Canada (computer-assisted surgery and interventions, medical robotics, medical image computing)
Parvin Mousavi
School of Computing, Queen's University (medical imaging, image guided interventions, systems biology, bioinformatics)