🤖 AI Summary
This study addresses spatial misalignment in longitudinal mammographic images, which obscures true tissue changes and hampers accurate breast cancer risk assessment. To quantify the impact of temporal alignment on deep learning model performance, we compare several alignment strategies and introduce a paradigm that explicitly applies image-level deformation fields within the feature space, in contrast to conventional feature-level or implicit alignment approaches. All models are trained end-to-end and evaluated on two large-scale mammography datasets. Results show that image-level registration produces the most anatomically plausible deformation fields and, critically, that applying these deformation fields in the feature space significantly improves risk prediction, yielding the highest accuracy, precision, and recall. The core contribution is establishing that anatomically consistent deformation modeling is essential for model robustness and discriminative capability, enabling interpretable, high-performance temporal modeling for personalized breast cancer screening.
📝 Abstract
Regular mammography screening is crucial for early breast cancer detection. By leveraging deep learning-based risk models, screening intervals can be personalized, especially for high-risk individuals. While recent methods increasingly incorporate longitudinal information from prior mammograms, accurate spatial alignment across time points remains a key challenge: misalignment can obscure meaningful tissue changes and degrade model performance. In this study, we evaluate the effectiveness of several alignment strategies for longitudinal deep learning-based risk modeling: image-based registration, feature-level (representation-space) alignment with and without regularization, and implicit alignment methods. Using two large-scale mammography datasets, we assess each method across key metrics, including predictive accuracy, precision, recall, and deformation field quality. Our results show that image-based registration consistently outperforms the more recently favored feature-based and implicit approaches across all metrics, enabling more accurate, temporally consistent predictions and generating smooth, anatomically plausible deformation fields. Although regularizing the deformation field improves deformation quality, it reduces the risk prediction performance of feature-level alignment. Applying image-based deformation fields within the feature space yields the best risk prediction performance. These findings underscore the importance of image-based deformation fields for spatial alignment in longitudinal risk modeling, offering improved prediction accuracy and robustness. This approach has strong potential to enhance personalized screening and enable earlier interventions for high-risk individuals. The code is available at https://github.com/sot176/Mammogram_Alignment_Study_Risk_Prediction.git, allowing full reproducibility of the results.
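The best-performing configuration described above, warping features from a prior exam using a deformation field estimated at the image level, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `warp_features`, the array shapes, and the assumption that the image-level displacement field has been resampled to the feature-map resolution are all choices made for this sketch.

```python
import numpy as np

def warp_features(feat, disp):
    """Warp a feature map with a dense displacement field (bilinear sampling).

    feat: (C, H, W) feature map, e.g. from the prior-exam encoder branch.
    disp: (2, H, W) displacement field (dy, dx), assumed here to come from
          image-level registration and to be downsampled to the feature
          resolution. Sampling coordinates are clamped to the border.
    """
    C, H, W = feat.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    sy = np.clip(ys + disp[0], 0, H - 1)  # source row for each output pixel
    sx = np.clip(xs + disp[1], 0, W - 1)  # source column for each output pixel
    y0 = np.floor(sy).astype(int)
    x0 = np.floor(sx).astype(int)
    y1 = np.clip(y0 + 1, 0, H - 1)
    x1 = np.clip(x0 + 1, 0, W - 1)
    wy = sy - y0
    wx = sx - x0
    # Bilinear interpolation of the four neighboring feature vectors.
    return ((1 - wy) * (1 - wx) * feat[:, y0, x0]
            + (1 - wy) * wx * feat[:, y0, x1]
            + wy * (1 - wx) * feat[:, y1, x0]
            + wy * wx * feat[:, y1, x1])
```

With a zero displacement field this is the identity, and a constant integer shift moves features by that offset, which makes the behavior easy to sanity-check before plugging a learned field in.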