Mitigating the Likelihood Paradox in Flow-based OOD Detection via Entropy Manipulation

📅 2026-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the well-known failure of flow-based generative models in out-of-distribution (OOD) detection, where such models often assign spuriously high likelihoods to OOD samples. The authors propose a novel method that enhances OOD detection without retraining the underlying density model. By measuring semantic dissimilarity between inputs and an in-distribution memory bank, the approach adaptively manipulates input entropy—applying stronger perturbations to samples exhibiting greater dissimilarity—to effectively widen the log-likelihood gap between in-distribution and OOD data. Theoretical analysis supports the validity of this entropy manipulation mechanism. Extensive experiments demonstrate that the proposed method substantially outperforms existing likelihood-based baselines on standard OOD benchmarks, achieving significant improvements in AUROC scores.

📝 Abstract
Deep generative models that can tractably compute input likelihoods, including normalizing flows, often assign unexpectedly high likelihoods to out-of-distribution (OOD) inputs. We mitigate this likelihood paradox by manipulating input entropy based on semantic similarity, applying stronger perturbations to inputs that are less similar to an in-distribution memory bank. We provide a theoretical analysis showing that entropy control widens the expected log-likelihood gap between in-distribution and OOD samples in favor of in-distribution data, and we explain why the procedure works without any additional training of the density model. We then evaluate our method against likelihood-based OOD detectors on standard benchmarks and find consistent AUROC improvements over baselines, supporting our explanation.
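The abstract's pipeline can be sketched in a few lines: score an input's dissimilarity to an in-distribution memory bank, perturb the input with noise whose scale grows with that dissimilarity, and read off the (frozen) density model's log-likelihood. The sketch below is illustrative only: all names (`dissimilarity`, `entropy_perturb`, `memory_bank`, `alpha`) are assumptions, not the paper's notation, and a standard Gaussian log-density stands in for the trained normalizing flow.

```python
# Hypothetical sketch of similarity-scaled entropy manipulation for OOD scoring.
# A stand-in Gaussian density replaces the paper's frozen flow model; the
# function and variable names are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def dissimilarity(x_feat, memory_bank):
    """1 - max cosine similarity between x_feat and the in-distribution bank."""
    x = x_feat / np.linalg.norm(x_feat)
    m = memory_bank / np.linalg.norm(memory_bank, axis=1, keepdims=True)
    return 1.0 - float(np.max(m @ x))

def entropy_perturb(x, d, alpha=0.5):
    """Add Gaussian noise whose scale grows with dissimilarity d, so
    OOD-like inputs receive stronger entropy-increasing perturbations."""
    return x + rng.normal(scale=alpha * d, size=x.shape)

def log_likelihood(x):
    """Stand-in density: standard Gaussian log-density. A real system would
    use the exact log-likelihood of the frozen normalizing flow instead."""
    return float(-0.5 * np.sum(x ** 2) - 0.5 * x.size * np.log(2 * np.pi))

# Toy memory bank of in-distribution features clustered near e1 = (1, 0).
memory_bank = np.stack(
    [np.array([1.0, 0.0]) + 0.01 * rng.normal(size=2) for _ in range(16)]
)

x_id = np.array([1.0, 0.05])    # close to the bank -> small perturbation
x_ood = np.array([-1.0, 0.2])   # far from the bank -> large perturbation

d_id = dissimilarity(x_id, memory_bank)
d_ood = dissimilarity(x_ood, memory_bank)
score_id = log_likelihood(entropy_perturb(x_id, d_id))
score_ood = log_likelihood(entropy_perturb(x_ood, d_ood))
```

Because the perturbation scale is near zero for in-distribution inputs, their likelihoods are barely disturbed, while dissimilar inputs are pushed toward lower-likelihood regions, which is the mechanism the paper credits for widening the log-likelihood gap without retraining.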
Problem

Research questions and friction points this paper is trying to address.

likelihood paradox
out-of-distribution detection
normalizing flows
deep generative models
input likelihood
Innovation

Methods, ideas, or system contributions that make the work stand out.

likelihood paradox
entropy manipulation
out-of-distribution detection
normalizing flows
semantic similarity