🤖 AI Summary
Clinical multi-modal MRI often suffers from missing modalities and from modalities that become available only later, which degrades brain tumor segmentation and triggers catastrophic forgetting when models are retrained; this paper proposes a sustainable learning framework, ReHyDIL, to address both problems. Methodologically, the framework combines domain-incremental learning with CHSNet, a cross-patient hypergraph segmentation network that explicitly captures high-order correlations across patients and modalities. A Tversky-aware contrastive loss jointly mitigates inter-modal and intra-modal information imbalance, and a lightweight replay mechanism preserves knowledge from previously learned tasks. Evaluated on BraTS2019, the method achieves over 2% Dice improvement across tumor regions, outperforming existing incremental and modality-robust approaches. The source code is publicly available.
📝 Abstract
Existing methods for multimodal MRI segmentation with missing modalities typically assume that all MRI modalities are available during training. However, in clinical practice, some modalities may be missing due to the sequential nature of MRI acquisition, leading to performance degradation. Furthermore, retraining models to accommodate newly available modalities can be inefficient and may cause overfitting, potentially compromising previously learned knowledge. To address these challenges, we propose Replay-based Hypergraph Domain Incremental Learning (ReHyDIL) for brain tumor segmentation with missing modalities. ReHyDIL leverages Domain Incremental Learning (DIL) to enable the segmentation model to learn from newly acquired MRI modalities without forgetting previously learned information. To enhance segmentation performance across diverse patient scenarios, we introduce the Cross-Patient Hypergraph Segmentation Network (CHSNet), which utilizes hypergraphs to capture high-order associations between patients. Additionally, we incorporate Tversky-Aware Contrastive (TAC) loss to effectively mitigate information imbalance both across and within different modalities. Extensive experiments on the BraTS2019 dataset demonstrate that ReHyDIL outperforms state-of-the-art methods, achieving an improvement of over 2% in the Dice Similarity Coefficient across various tumor regions. Our code is available at ReHyDIL.
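The abstract does not spell out the TAC loss, but it names the Tversky index as its basis. As an assumption, here is a minimal NumPy sketch of the plain soft Tversky index and loss that such a term presumably builds on; the function names and default weights are illustrative, not the paper's:

```python
import numpy as np

def tversky_index(pred, target, alpha=0.7, beta=0.3, eps=1e-6):
    """Soft Tversky index between a predicted probability map and a
    binary target mask. alpha weights false positives, beta weights
    false negatives; alpha = beta = 0.5 recovers the Dice coefficient.
    (Illustrative sketch; not the paper's TAC loss.)"""
    pred = pred.ravel().astype(float)
    target = target.ravel().astype(float)
    tp = np.sum(pred * target)            # true positives (soft)
    fp = np.sum(pred * (1.0 - target))    # false positives (soft)
    fn = np.sum((1.0 - pred) * target)    # false negatives (soft)
    return (tp + eps) / (tp + alpha * fp + beta * fn + eps)

def tversky_loss(pred, target, alpha=0.7, beta=0.3):
    """Loss form: 1 - Tversky index."""
    return 1.0 - tversky_index(pred, target, alpha, beta)
```

Setting alpha > beta penalizes false positives more heavily, which is one common way to counter class imbalance in tumor segmentation; how TAC combines this with the contrastive term is detailed in the paper itself.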