Generalizable Cervical Cancer Screening via Large-scale Pretraining and Test-Time Adaptation

πŸ“… 2025-02-12
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address the poor generalizability of AI-based cervical cancer screening systems across multi-center, multi-device, and multi-staining conditions, this study proposes Smart-CCS, a broadly deployable intelligent screening system. Methodologically, it introduces a two-stage paradigm: (1) large-scale self-supervised pretraining of a Vision Transformer (ViT) backbone, and (2) test-time adaptation that optimizes batch normalization statistics and aligns features across domains; the system further provides interpretable predictions at both the cell and whole-slide level. Evaluated on CCS-127K, a large-scale multi-center whole-slide cervical cytology dataset (127,471 images from 48 medical centers), Smart-CCS achieves an internal test AUC of 0.965 with a sensitivity of 0.913, an average AUC of 0.950 across six external test datasets, and prospective validation AUCs of 0.947–0.986 across three centers. Screening results were further validated against histology findings.
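The batch-normalization side of test-time adaptation can be illustrated with a minimal NumPy sketch: re-estimating BN running statistics on unlabeled test batches so that features from a new center (with different staining or scanner characteristics) are renormalized toward the distribution the model expects. The class name, momentum value, and data here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class AdaptiveBatchNorm:
    """Toy BatchNorm layer whose running statistics can be adapted at test time."""

    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        # Running statistics as they would be after source-domain training.
        self.running_mean = np.zeros(num_features)
        self.running_var = np.ones(num_features)
        self.momentum = momentum
        self.eps = eps

    def adapt(self, x):
        # Shift running statistics toward the unlabeled test batch's statistics
        # via an exponential moving average (no labels or gradients needed).
        batch_mean = x.mean(axis=0)
        batch_var = x.var(axis=0)
        self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * batch_mean
        self.running_var = (1 - self.momentum) * self.running_var + self.momentum * batch_var

    def __call__(self, x):
        # Normalize with the (possibly adapted) running statistics.
        return (x - self.running_mean) / np.sqrt(self.running_var + self.eps)

# Features from a hypothetical new center: shifted mean and scale relative
# to the source domain the running statistics were trained on.
rng = np.random.default_rng(0)
test_batch = rng.normal(loc=2.0, scale=1.5, size=(64, 8))

bn = AdaptiveBatchNorm(num_features=8)
for _ in range(50):      # repeated passes over unlabeled test data
    bn.adapt(test_batch)

normalized = bn(test_batch)
# After adaptation, normalized features are close to zero mean, unit variance.
```

The same idea carries over to a real ViT-plus-BN pipeline: keeping the network frozen and only updating normalization statistics is a cheap, label-free way to absorb staining and scanner shift at deployment time.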

πŸ“ Abstract
Cervical cancer is a leading malignancy of the female reproductive system. While AI-assisted cytology offers a cost-effective and non-invasive screening solution, current systems struggle with generalizability in complex clinical scenarios. To address this issue, we introduced Smart-CCS, a generalizable Cervical Cancer Screening paradigm based on pretraining and adaptation for building robust and generalizable screening systems. To develop and validate Smart-CCS, we first curated a large-scale, multi-center dataset named CCS-127K, which comprises a total of 127,471 cervical cytology whole-slide images collected from 48 medical centers. By leveraging large-scale self-supervised pretraining, our CCS models are equipped with strong generalization capability across diverse scenarios. We then incorporated test-time adaptation to optimize the trained CCS model for complex clinical settings, adapting and refining predictions to improve real-world applicability. We conducted large-scale system evaluation across various cohorts. In retrospective cohorts, Smart-CCS achieved an overall area under the curve (AUC) of 0.965 and a sensitivity of 0.913 for cancer screening on 11 internal test datasets. In external testing, performance remained high, with an AUC of 0.950 across 6 independent test datasets. In prospective cohorts, Smart-CCS achieved AUCs of 0.947, 0.924, and 0.986 in three prospective centers, respectively. Moreover, the system demonstrated superior sensitivity in diagnosing cervical cancer, with the accuracy of the screening results confirmed against histology findings. Interpretability analysis of cell and slide predictions further indicated that the system's decision-making aligns with clinical practice. Smart-CCS represents a significant advancement in cancer screening across diverse clinical contexts.
Problem

Research questions and friction points this paper is trying to address.

Enhance cervical cancer screening generalizability
Utilize large-scale pretraining for AI models
Implement test-time adaptation for clinical settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Large-scale self-supervised pretraining
Test-time adaptation optimization
Multi-center dataset curation
Hao Jiang
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.
Cheng Jin
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.
Huangjing Lin
Imsight Medical Technology, Co., Ltd.
Medical Image Analysis · Computer Vision · Deep Learning · Object Detection and Segmentation
Yanning Zhou
XPENG
Computer Vision · Medical Image Analysis
Xi Wang
Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China.
Jiabo Ma
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.
Li Ding
Department of Pathology, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China.
Jun Hou
Center of Obstetrics and Gynecology, Peking University Shenzhen Hospital, Shenzhen, China.; Institute of Obstetrics and Gynecology, Shenzhen PKU-HKUST Medical Center, Shenzhen, China.; Shenzhen Key Laboratory on Technology for Early Diagnosis of Major Gynecologic Diseases, Shenzhen, China.
Runsheng Liu
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.
Zhizhong Chai
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.
Luyang Luo
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.; Department of Biomedical Informatics, Harvard University, USA.
Huijuan Shi
Department of Pathology, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China.
Yinling Qian
Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China.
Qiong Wang
Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China.
Changzhong Li
Center of Obstetrics and Gynecology, Peking University Shenzhen Hospital, Shenzhen, China.; Institute of Obstetrics and Gynecology, Shenzhen PKU-HKUST Medical Center, Shenzhen, China.; Shenzhen Key Laboratory on Technology for Early Diagnosis of Major Gynecologic Diseases, Shenzhen, China.
Anjia Han
Department of Pathology, The First Affiliated Hospital, Sun Yat-sen University, Guangzhou, China.
R. Chan
Department of Anatomical and Cellular Pathology, The Chinese University of Hong Kong, Hong Kong SAR, China.
Hao Chen
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.; Department of Chemical and Biological Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR, China.; HKUST Shenzhen-Hong Kong Collaborative Innovation Research Institute, Shenzhen, China.; Division of Life Science, The Hong Kong University of Science and Technology, Hong Kong SAR, China.; State Key Laboratory of Molecular Neuroscience, The Hong Kong University of Science and Technology, Hong Kong SAR, China.