PDD: Manifold-Prior Diverse Distillation for Medical Anomaly Detection

📅 2026-03-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of medical image anomaly detection, where anomalies are often subtle, highly heterogeneous, and embedded within complex anatomical structures, rendering conventional discriminative activation map methods ineffective. To this end, the authors propose the PDD framework, which integrates manifold priors to unify global contextual and local structural knowledge from dual teacher models. A diversity-aware distillation strategy is introduced to prevent representation collapse while preserving high detection sensitivity. The framework employs VMamba-Tiny and Wide-ResNet50 as dual teacher encoders, coupled with Manifold Matching and Unification (MMU), Inter-Level Feature Adaption (InA), and Manifold Prior Affine (MPA) modules to perform manifold matching, feature adaptation, and cross-layer distillation. Extensive experiments on datasets such as HeadCT and BrainMRI demonstrate significant improvements over state-of-the-art methods, achieving up to an 11.8% gain in AUROC and a 3.4% increase in F1 max.

📝 Abstract
Medical image anomaly detection faces unique challenges due to subtle, heterogeneous anomalies embedded in complex anatomical structures. Through systematic Grad-CAM analysis, we reveal that discriminative activation maps fail on medical data, unlike their success on industrial datasets, motivating the need for manifold-level modeling. We propose PDD (Manifold-Prior Diverse Distillation), a framework that unifies dual-teacher priors into a shared high-dimensional manifold and distills this knowledge into dual students with complementary behaviors. Specifically, frozen VMamba-Tiny and Wide-ResNet50 encoders provide global contextual and local structural priors, respectively. Their features are unified through a Manifold Matching and Unification (MMU) module, while an Inter-Level Feature Adaption (InA) module enriches intermediate representations. The unified manifold is distilled into two students: one performs layer-wise distillation via InA for local consistency, while the other receives skip-projected representations through a Manifold Prior Affine (MPA) module to capture cross-layer dependencies. A diversity loss prevents representation collapse while maintaining detection sensitivity. Extensive experiments on multiple medical datasets demonstrate that PDD significantly outperforms existing state-of-the-art methods, achieving improvements of up to 11.8%, 5.1%, and 8.5% in AUROC on the HeadCT, BrainMRI, and ZhangLab datasets, respectively, and 3.4% in F1 max on the Uni-Medical dataset, establishing new state-of-the-art performance in medical image anomaly detection. The implementation will be released at https://github.com/OxygenLu/PDD.
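The abstract does not give the exact form of the distillation and diversity objectives, but the idea — pull both students toward the unified teacher manifold while penalizing similarity between the two students to avoid collapse — can be illustrated with a minimal NumPy sketch. All function names and the cosine-similarity loss form below are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def cosine_sim(a, b, eps=1e-8):
    """Cosine similarity between two flattened feature maps."""
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

def distillation_loss(teacher_feat, student_feat):
    """1 - cosine similarity: pulls a student's features toward
    the unified teacher manifold (assumed loss form)."""
    return 1.0 - cosine_sim(teacher_feat, student_feat)

def diversity_loss(student_a_feat, student_b_feat):
    """Penalizes high similarity between the two students' features,
    discouraging representation collapse (assumed loss form)."""
    return max(0.0, cosine_sim(student_a_feat, student_b_feat))

def total_loss(teacher_feat, student_a_feat, student_b_feat, lam=0.1):
    """Combined objective: both students match the teacher manifold,
    while a weighted diversity term keeps them apart. `lam` is a
    hypothetical trade-off weight."""
    return (distillation_loss(teacher_feat, student_a_feat)
            + distillation_loss(teacher_feat, student_b_feat)
            + lam * diversity_loss(student_a_feat, student_b_feat))
```

With identical teacher and student features, the two distillation terms vanish and only the weighted diversity penalty remains, showing how the objective trades detection consistency against representational diversity.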
Problem

Research questions and friction points this paper is trying to address.

medical anomaly detection
subtle anomalies
heterogeneous anomalies
complex anatomical structures
discriminative activation maps
Innovation

Methods, ideas, or system contributions that make the work stand out.

manifold distillation
dual-teacher framework
medical anomaly detection
feature unification
diversity loss