BootOOD: Self-Supervised Out-of-Distribution Detection via Synthetic Sample Exposure under Neural Collapse

📅 2025-11-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of detecting semantically similar out-of-distribution (OOD) samples in safety-critical scenarios, this paper proposes BootOOD, a fully self-supervised framework. Methodologically, it exploits Neural Collapse to synthesize pseudo-OOD features from in-distribution (ID) data alone via lightweight transformations, and introduces a lightweight auxiliary head that performs radius-based classification on feature norms, thereby decoupling ID classification from OOD detection without requiring any real OOD samples or outlier-exposure training. Its core innovation lies in establishing feature-norm disparity as a reliable discriminative signal for detecting semantically close OOD samples, combined with self-supervised pseudo-OOD generation for end-to-end optimization. On CIFAR-10/100 and ImageNet-200, BootOOD achieves OOD detection performance on par with state-of-the-art outlier-exposure methods while maintaining or even improving ID classification accuracy.

📝 Abstract
Out-of-distribution (OOD) detection is critical for deploying image classifiers in safety-sensitive environments, yet existing detectors often struggle when OOD samples are semantically similar to the in-distribution (ID) classes. We present BootOOD, a fully self-supervised OOD detection framework that bootstraps exclusively from ID data and is explicitly designed to handle semantically challenging OOD samples. BootOOD synthesizes pseudo-OOD features through simple transformations of ID representations and leverages Neural Collapse (NC), where ID features cluster tightly around class means with consistent feature norms. Unlike prior approaches that aim to constrain OOD features into subspaces orthogonal to the collapsed ID means, BootOOD introduces a lightweight auxiliary head that performs radius-based classification on feature norms. This design decouples OOD detection from the primary classifier and imposes a relaxed requirement: OOD samples are learned to have smaller feature norms than ID features, which is easier to satisfy when ID and OOD are semantically close. Experiments on CIFAR-10, CIFAR-100, and ImageNet-200 show that BootOOD outperforms prior post-hoc methods, surpasses training-based methods without outlier exposure, and is competitive with state-of-the-art outlier-exposure approaches while maintaining or improving ID accuracy.
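The abstract describes synthesizing pseudo-OOD features through "simple transformations of ID representations" and training OOD samples to have smaller feature norms than ID features. A minimal sketch of one such transformation, assuming norm shrinking is among the transforms used (the function name, `shrink_range` parameter, and exact transform are illustrative, not taken from the paper):

```python
import numpy as np

# A hedged sketch of pseudo-OOD feature synthesis: scale ID features
# toward the origin so their norms fall below ID norms, consistent with
# BootOOD's objective that OOD samples have smaller feature norms.
def synthesize_pseudo_ood(id_features, shrink_range=(0.1, 0.5), rng=None):
    """Scale each ID feature by a random factor < 1 to shrink its norm."""
    rng = np.random.default_rng() if rng is None else rng
    # One random shrink factor per sample, broadcast across feature dims.
    scales = rng.uniform(shrink_range[0], shrink_range[1],
                         size=(len(id_features), 1))
    return id_features * scales

# Toy ID features normalized to unit norm, mimicking the consistent
# feature norms that Neural Collapse induces on ID data.
rng = np.random.default_rng(0)
id_feats = rng.normal(size=(8, 16))
id_feats /= np.linalg.norm(id_feats, axis=1, keepdims=True)

pseudo_ood = synthesize_pseudo_ood(id_feats, rng=rng)
# Every pseudo-OOD feature now has a strictly smaller norm than its ID source.
print(np.all(np.linalg.norm(pseudo_ood, axis=1) < 1.0))  # → True
```

Because the transformation acts on features rather than input images, it is cheap enough to run inside the training loop, which is what makes the framework self-supervised end to end.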
Problem

Research questions and friction points this paper is trying to address.

Detecting semantically similar out-of-distribution samples for image classifiers
Developing self-supervised OOD detection using only in-distribution data
Handling challenging OOD samples through feature norm differentiation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Synthesizes pseudo-OOD features via ID transformations
Uses radius-based classification on feature norms
Decouples OOD detection from the primary classifier
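The radius-based decision described above can be sketched as a norm threshold at inference time. This is a simplification under stated assumptions: BootOOD learns the decision with a lightweight auxiliary head, whereas the percentile calibration and helper names below are hypothetical stand-ins:

```python
import numpy as np

# A hedged sketch of radius-based OOD detection on feature norms: flag a
# sample as OOD when its feature norm falls below a radius calibrated on
# ID data. The percentile rule is illustrative; the paper's auxiliary
# head learns this boundary rather than using a fixed threshold.
def calibrate_radius(id_features, percentile=5.0):
    """Pick a norm below which almost no ID features fall."""
    return np.percentile(np.linalg.norm(id_features, axis=1), percentile)

def is_ood(features, radius):
    """OOD samples are trained to have smaller norms than ID features."""
    return np.linalg.norm(features, axis=1) < radius

# Toy example: ID features with norms near 1, OOD features shrunk inward.
rng = np.random.default_rng(0)
id_feats = rng.normal(size=(100, 16))
id_feats /= np.linalg.norm(id_feats, axis=1, keepdims=True)
id_feats *= rng.uniform(0.9, 1.1, size=(100, 1))  # norms in [0.9, 1.1]
ood_feats = id_feats[:10] * 0.3                   # norms around 0.3

radius = calibrate_radius(id_feats)
print(is_ood(ood_feats, radius).all())  # → True
```

Because this score never touches the primary classifier's logits, ID accuracy is unaffected by the detection rule, which is the decoupling the bullet points describe.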
👥 Authors
Yuanchao Wang, NYU Shanghai Center for Data Science
Tian Qin, NYU Shanghai Center for Data Science
Eduardo Valle, valeo.ai
Bruno Abrahao, Leonard N. Stern School of Business, New York University