From Images to Insights: Transforming Brain Cancer Diagnosis with Explainable AI

📅 2025-01-09
🤖 AI Summary
To address the challenges of uneven physician expertise, low diagnostic efficiency, and the poor interpretability of existing methods for brain cancer diagnosis in low-resource settings, this study proposes a novel, clinically grounded deep learning framework. We construct and, for the first time, publicly release the Bangladesh Multi-Category Brain Cancer MRI Dataset, comprising 6,056 annotated scans across three classes (brain tumor, glioma, and meningioma), filling a critical gap in high-quality, publicly available data for low-resource regions. Our method couples a DenseNet169-based automatic classification model with multiple eXplainable AI (XAI) techniques (GradCAM, GradCAM++, ScoreCAM, and LayerCAM) to jointly optimize diagnostic accuracy and clinical interpretability. The model achieves exceptional performance: accuracy, precision, recall, and F1-score all reach 0.9983. XAI visualizations precisely localize pathological regions, substantially enhancing decision transparency, clinician trust, and real-world clinical applicability.
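The headline numbers (accuracy, precision, recall, and F1-score all reaching 0.9983) are standard multi-class classification metrics. A minimal sketch of how such figures are computed with scikit-learn, using small hypothetical label lists for the three classes (the paper's actual evaluation data is not reproduced here):

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical predictions; 0 = brain tumor, 1 = glioma, 2 = meningioma.
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 1, 2, 1]

acc = accuracy_score(y_true, y_pred)
# Macro averaging treats the three classes equally, as is common
# for imbalanced medical datasets.
prec, rec, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro"
)
print(f"acc={acc:.4f} prec={prec:.4f} rec={rec:.4f} f1={f1:.4f}")
```

With macro averaging, per-class precision, recall, and F1 are computed first and then averaged, so a rare class weighs as much as a common one.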

📝 Abstract
Brain cancer represents a major challenge in medical diagnostics, requiring precise and timely detection for effective treatment. Diagnosis initially relies on the proficiency of radiologists, which poses difficulties and risks where such expertise is scarce. Even with imaging resources, diagnosing brain cancer often remains difficult, time-consuming, and vulnerable to intraclass variability. This study introduces the Bangladesh Brain Cancer MRI Dataset, containing 6,056 MRI images organized into three categories: Brain Tumor, Brain Glioma, and Brain Menin. The dataset was collected from several hospitals in Bangladesh, providing a diverse and realistic sample for research. We implemented advanced deep learning models, and DenseNet169 achieved exceptional results, with accuracy, precision, recall, and F1-score all reaching 0.9983. In addition, Explainable AI (XAI) methods, including GradCAM, GradCAM++, ScoreCAM, and LayerCAM, were employed to provide visual representations of the models' decision-making processes. In the context of brain cancer, these techniques highlight DenseNet169's potential to enhance diagnostic accuracy while simultaneously offering transparency, facilitating early diagnosis and better patient outcomes.
Problem

Research questions and friction points this paper is trying to address.

Brain Cancer Diagnosis
Accuracy and Efficiency
Medical Professional Variability
Innovation

Methods, ideas, or system contributions that make the work stand out.

DenseNet169
Interpretable AI
Brain Tumor Classification
Md. Arafat Alam Khandaker
Department of Computer Science and Engineering, Ahsanullah University of Science and Technology, Dhaka, Bangladesh
Ziyan Shirin Raha
Adjunct Lecturer of Department of CSE, Southeast University, Bangladesh
Machine Learning, Deep Learning, Computer Vision, Natural Language Processing
Salehin Bin Iqbal
Department of Computer Science and Engineering, Ahsanullah University of Science and Technology, Dhaka, Bangladesh
M. F. Mridha
Department of Computer Science, American International University Bangladesh, Dhaka, Bangladesh
Jungpil Shin
Department of Computer Science and Engineering, University of Aizu, Aizuwakamatsu, Japan