Why Invariance is Not Enough for Biomedical Domain Generalization and How to Fix It

📅 2026-04-02
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This study addresses the sharp performance degradation of 3D biomedical image segmentation models under distribution shifts caused by variations in imaging modality, disease severity, or clinical site, which undermines their clinical reliability. To tackle this challenge, the authors propose DropGen, a lightweight, architecture- and loss-agnostic domain generalization method that integrates seamlessly with standard data augmentation. The authors demonstrate that representation invariance alone is insufficient for robust generalization; DropGen instead achieves robustness by fusing intensity information from source-domain images with domain-stable representations from foundation models. Extensive experiments show that DropGen consistently outperforms existing methods across diverse distribution-shift scenarios, in both fully supervised and few-shot settings, while remaining applicable to 3D segmentation of arbitrary anatomical regions.
πŸ“ Abstract
We present DropGen, a simple and theoretically grounded approach for domain generalization in 3D biomedical image segmentation. Modern segmentation models degrade sharply under shifts in modality, disease severity, clinical site, and other factors, yielding brittle behavior that limits reliable deployment. Existing domain generalization methods rely on extreme augmentations, mixing of domain statistics, or architectural redesigns, yet incur significant implementation overhead and deliver inconsistent performance across biomedical settings. DropGen instead offers a principled learning strategy with minimal overhead that leverages both source-domain image intensities and domain-stable foundation-model representations to train robust segmentation models. As a result, DropGen achieves strong gains in both fully supervised and few-shot segmentation across a broad range of distribution shifts in biomedical studies. Unlike prior approaches, DropGen is architecture- and loss-agnostic, compatible with standard augmentation pipelines, computationally lightweight, and applicable to arbitrary anatomical regions. Our implementation is freely available at https://github.com/sebodiaz/DropGen.
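The abstract does not specify DropGen's actual mechanism, so the following is only an illustrative sketch of the general idea it names: fusing raw source-domain image intensities with domain-stable features from a frozen foundation-model encoder. Everything here is hypothetical — `frozen_foundation_features` is a toy stand-in (a fixed random projection) for a real pretrained encoder, and channel-wise concatenation is just one plausible fusion scheme, not the paper's method.

```python
import numpy as np

def frozen_foundation_features(volume: np.ndarray) -> np.ndarray:
    """Toy stand-in for a frozen foundation-model encoder.

    A real system would run a pretrained 3D encoder here; this sketch
    uses a fixed (seeded) random projection so the 'embedding' is
    deterministic for a given input, mimicking frozen weights.
    """
    proj = np.random.default_rng(42).normal(size=(volume.size, 16))
    return volume.reshape(-1) @ proj  # shape (16,)

def fuse_intensity_and_features(volume: np.ndarray) -> np.ndarray:
    """Fuse normalized source-domain intensities with the
    domain-stable embedding via channel-wise concatenation."""
    intensity = (volume - volume.mean()) / (volume.std() + 1e-8)
    feats = frozen_foundation_features(volume)
    # Broadcast the embedding over the spatial grid as extra channels.
    spatial_feats = np.broadcast_to(feats, volume.shape + (feats.size,))
    return np.concatenate([intensity[..., None], spatial_feats], axis=-1)

rng = np.random.default_rng(0)
patch = rng.normal(size=(4, 4, 4))        # toy 3D image patch
fused = fuse_intensity_and_features(patch)
print(fused.shape)                         # (4, 4, 4, 17)
```

A downstream segmentation network would then consume the fused tensor, so its predictions can draw on both the raw intensities and the (assumed) domain-stable representation rather than on invariant features alone.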
Problem

Research questions and friction points this paper is trying to address.

domain generalization
biomedical image segmentation
distribution shift
model robustness
3D medical imaging
Innovation

Methods, ideas, or system contributions that make the work stand out.

domain generalization
biomedical image segmentation
foundation models
DropGen
distribution shift