Improving Artifact Robustness for CT Deep Learning Models Without Labeled Artifact Images via Domain Adaptation

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Deep learning models for CT image analysis generalize poorly to artifacts not encountered during training, such as ring artifacts induced by detector gain error. Method: We propose an unsupervised domain adaptation framework based on Domain-Adversarial Neural Networks (DANN), modeling ring artifacts directly in the sinogram domain. Using the OrganAMNIST abdominal CT dataset, the approach enables robust model training without labeled examples of novel artifacts. Contribution/Results: The method significantly improves classification robustness to unseen ring artifacts, matching the performance of fully supervised models trained on annotated artifact data and outperforming baselines trained only on clean images or with conventional data augmentation. Notably, it also shows cross-artifact generalization, unexpectedly improving robustness to uniform noise. This work offers a practical, cost-effective way to reduce the annotation burden in clinical deployment while improving real-world model resilience.

📝 Abstract
Deep learning models that perform well on images from their training distribution can degrade substantially when applied to new distributions. If a CT scanner introduces a new artifact not present in the training data, the model may misclassify the images. Although modern CT scanners include design features that mitigate these artifacts, unanticipated or difficult-to-mitigate artifacts can still appear in practice. The direct solution of labeling images from this new distribution can be costly. As a more accessible alternative, this study evaluates domain adaptation as an approach for training models that maintain classification performance despite new artifacts, even without corresponding labels. We simulate ring artifacts from detector gain error in sinogram space and evaluate domain-adversarial neural networks (DANN) against baseline and augmentation-based approaches on the OrganAMNIST abdominal CT dataset. Our results demonstrate that baseline models trained only on clean images fail to generalize to images with ring artifacts, and that traditional augmentation with other distortion types provides no improvement on unseen artifact domains. In contrast, the DANN approach maintains high classification accuracy on ring artifact images using only unlabeled artifact data during training, demonstrating the viability of domain adaptation for artifact robustness. The domain-adapted model achieved classification performance on ring artifact test data comparable to models explicitly trained with labeled artifact images, while also showing unexpected generalization to uniform noise. These findings provide empirical evidence that domain adaptation can effectively address distribution shift in medical imaging without requiring expensive expert labeling of new artifact distributions, suggesting promise for deployment in clinical settings where novel artifacts may emerge.
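The abstract describes simulating ring artifacts as detector gain error in sinogram space: a miscalibrated detector element scales its entire sinogram column, which back-projects to a concentric ring in the reconstructed image. A minimal NumPy sketch of that corruption model follows; the function name and the Gaussian gain model are illustrative assumptions, not the authors' code:

```python
import numpy as np

def add_detector_gain_error(sinogram, gain_std=0.05, seed=None):
    """Apply a fixed multiplicative gain error to each detector element.

    sinogram: array of shape (n_angles, n_detectors). Each column
    corresponds to one detector element, so a per-column gain error
    is constant across projection angles -- the signature of a ring
    artifact after reconstruction.
    """
    rng = np.random.default_rng(seed)
    gains = 1.0 + rng.normal(0.0, gain_std, size=sinogram.shape[1])
    return sinogram * gains[np.newaxis, :]
```

Reconstructing the corrupted sinogram (e.g. with filtered back-projection) then yields the ring pattern in image space.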
Problem

Research questions and friction points this paper is trying to address.

Maintaining CT model performance when encountering new scanner artifacts
Eliminating need for labeled artifact images through domain adaptation
Addressing distribution shift in medical imaging without expert labeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Domain adaptation for CT artifact robustness
Using unlabeled artifact data via DANN
Maintains accuracy without labeled artifact images
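DANN trains a shared feature extractor with two heads: a label classifier on labeled clean images and a domain discriminator that predicts clean vs. artifact. A gradient reversal layer (GRL) sits between the extractor and the discriminator, so the extractor is pushed to produce domain-invariant features. A framework-agnostic sketch of the GRL contract (the class name and λ handling are illustrative, not from the paper):

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; negates and scales the gradient in
    the backward pass. The domain discriminator minimizes its loss as
    usual, while the flipped gradient makes the feature extractor
    maximize it, driving features toward domain invariance."""

    def __init__(self, lam=1.0):
        self.lam = lam  # adversarial weight, often ramped from 0 to 1 during training

    def forward(self, features):
        return features  # pass-through

    def backward(self, grad_from_domain_head):
        return -self.lam * grad_from_domain_head  # sign flip toward the extractor
```

In autograd frameworks the same behavior is implemented as a custom op with an overridden backward pass; only unlabeled artifact images are needed for the domain head.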
Justin Cheung
Johns Hopkins University, Baltimore, Maryland, USA
Samuel Savine
Johns Hopkins University, Baltimore, Maryland, USA
Calvin Nguyen
Johns Hopkins University, Baltimore, Maryland, USA
Lin Lu
PhD student, Nankai University
Alhassan S. Yasin
Johns Hopkins University, Baltimore, Maryland, USA