EOOD: Entropy-based Out-of-distribution Detection

πŸ“… 2025-04-04
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Deep neural networks (DNNs) often exhibit overconfidence on out-of-distribution (OOD) inputs, severely compromising reliability in real-world deployment. To address this, we propose a label-free, fine-tuning-free OOD detection method. Our approach first synthesizes controllable pseudo-OOD samples to identify the network module most sensitive to ID/OOD information flow discrepancies. We then compute the conditional entropy of that module’s output as an unsupervised OOD confidence score. The method integrates module-level information flow analysis, multi-layer feature response comparison, and targeted pseudo-OOD generation to substantially enhance discriminative power. Evaluated across multiple standard in-distribution/out-of-distribution benchmarks, our method consistently outperforms existing state-of-the-art approaches, achieving an average 12.3% reduction in false positive rate at 95% true positive rate (FPR95). Results demonstrate its effectiveness, broad applicability across architectures and datasets, and plug-and-play compatibility.

πŸ“ Abstract
Deep neural networks (DNNs) often exhibit overconfidence when encountering out-of-distribution (OOD) samples, posing significant challenges for deployment. Since DNNs are trained on in-distribution (ID) datasets, the information flow of ID samples through DNNs inevitably differs from that of OOD samples. In this paper, we propose an Entropy-based Out-Of-distribution Detection (EOOD) framework. EOOD first identifies the specific block where the information flow differences between ID and OOD samples are most pronounced, using both ID and pseudo-OOD samples. It then calculates the conditional entropy on the selected block as the OOD confidence score. Comprehensive experiments conducted across various ID and OOD settings demonstrate the effectiveness of EOOD in OOD detection and its superiority over state-of-the-art methods.
Problem

Research questions and friction points this paper is trying to address.

DNNs produce overconfident predictions on out-of-distribution inputs
Existing scores do not exploit ID/OOD information-flow differences
Entropy can quantify these flow differences without labels or fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Entropy-based OOD detection framework (EOOD)
Pseudo-OOD samples identify the block with the most pronounced flow differences
Conditional entropy on that block serves as the OOD confidence score
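The scoring idea above can be sketched in a few lines: normalize a block's channel activations into a distribution and use its Shannon entropy as the OOD score, with higher entropy indicating a more OOD-like input. This is a minimal NumPy illustration, not the paper's implementation; the function name, softmax normalization, and per-channel treatment are assumptions.

```python
import numpy as np

def entropy_ood_score(block_activations):
    """Illustrative entropy-based OOD score over a block's channel
    activations (details are assumptions, not the EOOD reference code)."""
    a = np.asarray(block_activations, dtype=np.float64)
    # Softmax over channels, shifted for numerical stability
    a = a - a.max(axis=-1, keepdims=True)
    p = np.exp(a) / np.exp(a).sum(axis=-1, keepdims=True)
    # Shannon entropy per sample; higher entropy -> more OOD-like
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

# A near-uniform activation pattern scores higher than a peaked one
peaked = entropy_ood_score([[10.0, 0.0, 0.0, 0.0]])
flat = entropy_ood_score([[1.0, 1.0, 1.0, 1.0]])
```

In practice one would threshold such a score (e.g. at the value giving 95% true-positive rate on ID data) to flag OOD inputs.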
πŸ”Ž Similar Papers
No similar papers found.
Guide Yang
Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou, China; Huangpu Research School, Guangzhou University, Guangzhou, China
Chao Hou
Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou, China; Huangpu Research School, Guangzhou University, Guangzhou, China
Weilong Peng
School of Computer Science and Cyber Engineering, Guangzhou University, Guangzhou, China
Xiang Fang
IGP-ERI@N, Nanyang Technological University, Singapore, Singapore
Yongwei Nie
South China University of Technology
Computer Graphics, Computer Vision
Peican Zhu
School of Artificial Intelligence, OPtics and ElectroNics (iOPEN), Northwestern Polytechnical University, Xi’an, China
Keke Tang
Full Professor of Cybersecurity, Guangzhou University
AI security, 3D vision, computer graphics, robotics