🤖 AI Summary
This work addresses out-of-distribution (OOD) detection in settings where training classes have heterogeneous covariance structures and only a few samples per class. We first establish a formal Bayesian nonparametric interpretation of the relative Mahalanobis distance score (RMDS), a widely used OOD detector. Building on this connection, we propose Dirichlet process mixture models with hierarchical priors that adaptively capture inter-class covariance heterogeneity and generalize the RMDS in a principled way. Evaluated on the OpenOOD benchmark, these models improve upon existing OOD detectors, with the largest gains under strong covariance heterogeneity and in sample-scarce regimes (e.g., fewer than 10 samples per class), demonstrating robustness and generalization in small-sample, structurally complex OOD scenarios.
📝 Abstract
Bayesian nonparametric methods are naturally suited to the problem of out-of-distribution (OOD) detection. However, these techniques have largely been eschewed in favor of simpler methods based on distances between pre-trained or learned embeddings of data points. Here we show a formal relationship between Bayesian nonparametric models and the relative Mahalanobis distance score (RMDS), a commonly used method for OOD detection. Building on this connection, we propose Bayesian nonparametric mixture models with hierarchical priors that generalize the RMDS. We evaluate these models on the OpenOOD detection benchmark and show that Bayesian nonparametric methods can improve upon existing OOD methods, especially in regimes where training classes differ in their covariance structure and where there are relatively few data points per class.
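For readers unfamiliar with the baseline, the RMDS mentioned above scores a test embedding by comparing its Mahalanobis distance under per-class Gaussians with a tied covariance against its distance under a single class-agnostic "background" Gaussian fit to all embeddings. Below is a minimal NumPy sketch of that standard score (the commonly used baseline, not the hierarchical models proposed in this work); the function names and the `eps` regularizer are illustrative assumptions, and `feats` is an `(N, D)` array of pre-trained embeddings with integer class `labels` of shape `(N,)`.

```python
import numpy as np

def fit_rmds_gaussians(feats, labels):
    """Fit per-class means with a tied covariance, plus a single
    class-agnostic 'background' Gaussian over all embeddings."""
    classes = np.unique(labels)
    means = [feats[labels == c].mean(axis=0) for c in classes]
    centered = np.concatenate(
        [feats[labels == c] - mu for c, mu in zip(classes, means)], axis=0)
    tied_cov = centered.T @ centered / len(feats)       # shared across classes
    bg_mean = feats.mean(axis=0)
    bg_cov = np.cov(feats, rowvar=False, bias=True)     # ignores class labels
    return means, tied_cov, bg_mean, bg_cov

def rmds_score(x, means, tied_cov, bg_mean, bg_cov, eps=1e-6):
    """Relative Mahalanobis distance score for one embedding x.
    Higher scores indicate the point looks more in-distribution."""
    d = x.shape[-1]
    prec = np.linalg.inv(tied_cov + eps * np.eye(d))
    bg_prec = np.linalg.inv(bg_cov + eps * np.eye(d))
    md_bg = (x - bg_mean) @ bg_prec @ (x - bg_mean)
    # Relative distance to class k: class Mahalanobis distance minus background.
    rmd = [(x - mu) @ prec @ (x - mu) - md_bg for mu in means]
    return -min(rmd)
```

As the abstract notes, the gains from the hierarchical Bayesian nonparametric mixtures are expected precisely where this tied-covariance assumption breaks down: when classes differ in their covariance structure and there are few data points per class.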