Balanced Hyperbolic Embeddings Are Natural Out-of-Distribution Detectors

📅 2025-06-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses out-of-distribution (OOD) detection, a critical challenge in deep learning, by proposing Balanced Hyperbolic Learning (BHL). The authors first show that a well-structured hierarchical hyperbolic embedding inherently facilitates OOD detection, and introduce hyperbolic class prototypes jointly optimized to minimize hierarchical distortion and to balance shallow and wide sub-hierarchies. Leveraging hyperbolic distance metrics, hyperbolic generalizations of existing OOD scoring functions, and hierarchical prototype modeling, BHL improves OOD discriminability without requiring additional training. Extensive experiments across 13 benchmark datasets and 13 OOD scoring functions demonstrate that BHL consistently outperforms existing methods, including state-of-the-art hyperbolic and contrastive learning approaches, achieving new state-of-the-art performance.

📝 Abstract
Out-of-distribution recognition forms an important and well-studied problem in deep learning, with the goal of filtering out samples that do not belong to the distribution on which a network has been trained. The conclusion of this paper is simple: a good hierarchical hyperbolic embedding is preferred for discriminating in- and out-of-distribution samples. We introduce Balanced Hyperbolic Learning. We outline a hyperbolic class embedding algorithm that jointly optimizes for hierarchical distortion and for balance between shallow and wide sub-hierarchies. We then use the class embeddings as hyperbolic prototypes for classification on in-distribution data. We outline how to generalize existing out-of-distribution scoring functions to operate with hyperbolic prototypes. Empirical evaluations across 13 datasets and 13 scoring functions show that our hyperbolic embeddings outperform existing out-of-distribution approaches when trained on the same data with the same backbones. We also show that our hyperbolic embeddings outperform other hyperbolic approaches, beat state-of-the-art contrastive methods, and natively enable hierarchical out-of-distribution generalization.
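To make the core idea concrete, here is a minimal sketch of prototype-based classification and distance-based OOD scoring in the Poincaré ball. This is an illustration of the general mechanism, not the paper's implementation: the prototype values, the specific score (distance to the nearest prototype), and all function names are assumptions for the example; the paper instead generalizes a range of existing scoring functions and learns the prototypes via its balanced embedding algorithm.

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    # Geodesic distance between two points in the Poincare ball:
    # d(x, y) = arcosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
    sq_dist = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / max(denom, eps))

def classify(z, prototypes):
    # Assign the class whose hyperbolic prototype is nearest to embedding z.
    dists = np.array([poincare_distance(z, p) for p in prototypes])
    return int(np.argmin(dists)), dists

def ood_score(z, prototypes):
    # A simple hyperbolic OOD score (an assumption for illustration):
    # the larger the distance to the nearest class prototype,
    # the more likely z is out-of-distribution.
    _, dists = classify(z, prototypes)
    return float(dists.min())
```

An embedding that lands close to a class prototype receives a low score (in-distribution), while one far from every prototype receives a high score and can be flagged as OOD by thresholding.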
Problem

Research questions and friction points this paper is trying to address.

Detect out-of-distribution samples in deep learning
Optimize hierarchical hyperbolic embeddings for classification
Improve out-of-distribution detection with hyperbolic prototypes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Balanced Hyperbolic Learning for embeddings
Hyperbolic class embedding algorithm
Generalized out-of-distribution scoring functions