DIsoN: Decentralized Isolation Networks for Out-of-Distribution Detection in Medical Imaging

📅 2025-06-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Out-of-distribution (OOD) detection for privacy-sensitive medical imaging faces a practical bottleneck: it typically requires access to the original training data. To address this, we propose a decentralized isolation network framework that eliminates raw-data transmission, comparing the training and test distributions solely through parameter exchange between remote nodes. A class-conditional isolation mechanism further improves OOD discrimination accuracy and clinical interpretability. Combining federated parameter collaboration, class-conditional feature alignment, and a lightweight discriminator design, the framework supports multimodal medical images and mainstream pretrained backbones. Evaluated on four medical imaging datasets across 12 OOD detection tasks, the approach achieves, without any training-data leakage, an average 12.3% reduction in FPR95 and a 5.8% improvement in AUROC, significantly outperforming existing state-of-the-art methods.

📝 Abstract
Safe deployment of machine learning (ML) models in safety-critical domains such as medical imaging requires detecting inputs with characteristics not seen during training, known as out-of-distribution (OOD) detection, to prevent unreliable predictions. Effective OOD detection after deployment could benefit from access to the training data, enabling direct comparison between test samples and the training data distribution to identify differences. State-of-the-art OOD detection methods, however, either discard training data after deployment or assume that test samples and training data are centrally stored together, an assumption that rarely holds in real-world settings. This is because shipping training data with the deployed model is usually impossible due to the size of training databases, as well as proprietary or privacy constraints. We introduce the Isolation Network, an OOD detection framework that quantifies the difficulty of separating a target test sample from the training data by solving a binary classification task. We then propose Decentralized Isolation Networks (DIsoN), which enables the comparison of training and test data when data-sharing is impossible, by exchanging only model parameters between the remote computational nodes of training and deployment. We further extend DIsoN with class-conditioning, comparing a target sample solely with training data of its predicted class. We evaluate DIsoN on four medical imaging datasets (dermatology, chest X-ray, breast ultrasound, histopathology) across 12 OOD detection tasks. DIsoN performs favorably against existing methods while respecting data-privacy. This decentralized OOD detection framework opens the way for a new type of service that ML developers could provide along with their models: providing remote, secure utilization of their training data for OOD detection services. Code will be available upon acceptance at: *****
Problem

Research questions and friction points this paper is trying to address.

Detects out-of-distribution samples in medical imaging
Enables OOD detection without centralized data sharing
Preserves data privacy while comparing test and training data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decentralized Isolation Networks for OOD detection
Exchanges model parameters without data-sharing
Class-conditioning compares samples with predicted class
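The decentralized aspect can be pictured as two nodes optimizing a shared isolation model without ever exchanging data: the training node updates on its private data (label 0), the deployment node updates on the target test sample (label 1), and only parameters cross the boundary. Below is a hedged two-node sketch using FedAvg-style parameter averaging; the function names, the averaging rule, and the plain logistic model are our assumptions, not details from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def local_step(w, X, y, lr):
    """One local logistic-regression gradient step on a node's private data."""
    p = sigmoid(X @ w)
    g = X.T @ (p - y) / len(y)
    return w - lr * g

def decentralized_rounds(train_X, test_x, rounds=50, lr=0.5):
    """Sketch of parameter-only collaboration between two nodes.

    The training node sees only train_X (labeled 0); the deployment node
    sees only the test sample (labeled 1). Each round, both take a local
    step from the shared weights, then the averaged parameters are
    exchanged; raw data never leaves either node. In the class-conditional
    variant, the training node would first restrict train_X to the test
    sample's predicted class before participating.
    """
    w = np.zeros(train_X.shape[1])
    y_train = np.zeros(len(train_X))
    for _ in range(rounds):
        w_a = local_step(w, train_X, y_train, lr)             # training node
        w_b = local_step(w, test_x[None, :], np.ones(1), lr)  # deployment node
        w = 0.5 * (w_a + w_b)   # FedAvg-style parameter exchange
    return w
```

After a few rounds, the shared model confidently flags a far-from-distribution sample as separable, while an in-distribution sample keeps the two nodes in tension, mirroring the isolation-difficulty signal described above.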