🤖 AI Summary
This work addresses the challenge of test-time out-of-distribution (OOD) detection—where anomalous samples deviate significantly from the training distribution and model fine-tuning is prohibited—by proposing a fine-tuning-free, test-time dynamic dictionary mechanism. Methodologically, it introduces: (1) adaptive feature accumulation and priority-queue-based dictionary updating guided by in-distribution (ID) sample informativeness; (2) ID-guided pseudo-OOD generation with a two-stage stabilization strategy to enhance dictionary robustness; and (3) optimized KNN distance estimation with dynamic threshold calibration. Evaluated on the OpenOOD benchmark, the proposed approach reduces FPR95 by 26.0% for far-domain OOD detection on CIFAR-100, while a KNN variant achieves a 3× speedup without performance degradation. The authors claim this is the first method enabling test-time dynamic construction and continuous evolution of an OOD feature dictionary, achieving a favorable trade-off among detection accuracy, computational efficiency, and deployment practicality.
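The core data structure described above—a bounded, priority-queue-based OOD feature dictionary combined with KNN distance scoring—can be sketched as follows. This is a minimal illustration under assumed design choices (the capacity, the informativeness key, and the `k`-th-neighbor scoring rule are generic stand-ins, not the paper's exact implementation):

```python
import heapq
import numpy as np

class DynamicOODDictionary:
    """Bounded test-time OOD feature dictionary (hedged sketch).

    Features enter with an informativeness score; when the dictionary
    is full, the least informative entry is evicted via a min-heap.
    """

    def __init__(self, capacity=128):
        self.capacity = capacity
        self._heap = []      # entries: (score, counter, feature)
        self._counter = 0    # tie-breaker so ndarrays are never compared

    def push(self, feature, score):
        entry = (score, self._counter, feature)
        self._counter += 1
        if len(self._heap) < self.capacity:
            heapq.heappush(self._heap, entry)
        else:
            # Push the new entry, then pop the lowest-scoring one:
            # the dictionary keeps only the most informative features.
            heapq.heappushpop(self._heap, entry)

    def features(self):
        return np.stack([f for _, _, f in self._heap])

def knn_ood_score(query, bank, k=1):
    """Distance to the k-th nearest neighbor in a feature bank:
    a larger distance means the query is more OOD-like."""
    d = np.linalg.norm(bank - query, axis=1)
    return float(np.sort(d)[k - 1])
```

In this sketch the heap key plays the role of the informativeness criterion; any scalar score (e.g., derived from ID-sample statistics) can be plugged in without changing the update logic.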
📝 Abstract
Out-of-distribution (OOD) detection remains challenging for deep learning models, particularly when test-time OOD samples differ significantly from training outliers. We propose OODD, a novel test-time OOD detection method that dynamically maintains and updates an OOD dictionary without fine-tuning. Our approach leverages a priority-queue-based dictionary that accumulates representative OOD features during testing, combined with an informative inlier sampling strategy for in-distribution (ID) samples. To ensure stable performance during early testing, we propose a dual OOD stabilization mechanism that leverages strategically generated outliers derived from ID data. Extensive experiments on the OpenOOD benchmark demonstrate that OODD significantly outperforms existing methods, achieving a 26.0% improvement in FPR95 on CIFAR-100 Far OOD detection compared to the state-of-the-art approach. Furthermore, we present an optimized variant of the KNN-based OOD detection framework that achieves a 3× speedup while maintaining detection performance.