On Uniformly Scaling Flows: A Density-Aligned Approach to Deep One-Class Classification

📅 2025-10-10
🤖 AI Summary
Deep one-class classification and density estimation remain theoretically disconnected paradigms in unsupervised anomaly detection. Method: We propose a unified framework based on Uniform Scaling Flows (USFs), invertible transformations with a constant Jacobian determinant, which theoretically bridge maximum-likelihood training and the Deep SVDD objective. USFs inherently integrate density modeling with distance-aware representation learning, preventing representation collapse and tightly aligning the negative log-likelihood with latent-space norm penalties. Our approach builds on normalizing flow architectures, incorporates one-class regularization, and is compatible with hybrid models (e.g., VAEs). Results: Extensive experiments across multiple benchmarks and backbone networks demonstrate significant improvements in both image-level and pixel-level anomaly detection, enhanced training stability, and consistent gains, validating USFs as an effective, plug-and-play unified paradigm for unsupervised anomaly detection.

📝 Abstract
Unsupervised anomaly detection is often framed around two widely studied paradigms. Deep one-class classification, exemplified by Deep SVDD, learns compact latent representations of normality, while density estimators realized by normalizing flows directly model the likelihood of nominal data. In this work, we show that uniformly scaling flows (USFs), normalizing flows with a constant Jacobian determinant, precisely connect these approaches. Specifically, we prove that training a USF via maximum likelihood reduces to a Deep SVDD objective with a unique regularization that inherently prevents representational collapse. This theoretical bridge implies that USFs inherit both the density faithfulness of flows and the distance-based reasoning of one-class methods. We further demonstrate that USFs induce a tighter alignment between negative log-likelihood and latent norm than either Deep SVDD or non-USFs, and show how recent hybrid approaches combining one-class objectives with VAEs extend naturally to USFs. Consequently, we advocate using USFs as a drop-in replacement for non-USFs in modern anomaly detection architectures. Empirically, this substitution yields consistent performance gains and substantially improved training stability across multiple benchmarks and model backbones for both image-level and pixel-level detection. These results unify two major anomaly detection paradigms, advancing both theoretical understanding and practical performance.
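The core connection claimed in the abstract can be illustrated numerically: under the change-of-variables formula, a flow with a constant Jacobian determinant has a negative log-likelihood that equals the squared latent norm (the Deep SVDD distance to a center at the origin) up to an additive constant, so maximum likelihood and the one-class objective rank samples identically. The sketch below is illustrative only, not the paper's architecture: the toy "USF" (an orthogonal rotation followed by a global scale `s`) and all names are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # data dimensionality (illustrative)

# Hypothetical uniformly scaling flow: an orthogonal rotation Q followed by a
# constant global scale s, so |det J| = s**d everywhere (input-independent).
s = 0.5
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random orthogonal matrix

def usf(x):
    """Toy USF: z = s * Q @ x; the Jacobian determinant is s**d for every x."""
    return s * (Q @ x)

def nll(x):
    """Change-of-variables NLL under a standard-normal base density."""
    z = usf(x)
    log_det = d * np.log(s)  # constant for a USF
    log_base = -0.5 * z @ z - 0.5 * d * np.log(2 * np.pi)
    return -(log_base + log_det)

# Because log|det J| is constant, the NLL differs from the squared latent
# norm 0.5 * ||z||^2 only by a sample-independent constant c.
x1, x2 = rng.standard_normal(d), rng.standard_normal(d)
c = 0.5 * d * np.log(2 * np.pi) - d * np.log(s)
assert np.isclose(nll(x1) - 0.5 * usf(x1) @ usf(x1), c)
assert np.isclose(nll(x2) - 0.5 * usf(x2) @ usf(x2), c)
```

For a non-uniformly-scaling flow, `log_det` depends on the input, and this constant-offset relationship between likelihood and latent distance no longer holds.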
Problem

Research questions and friction points this paper is trying to address.

Bridging deep one-class classification and density estimation methods
Preventing representational collapse in unsupervised anomaly detection
Improving performance and stability across anomaly detection benchmarks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uniformly scaling flows connect one-class classification and density estimation
USFs prevent representational collapse through unique regularization
USFs replace non-uniform flows in anomaly detection architectures
Faried Abu Zaid
Research Center Trustworthy Data Science and Security, TU Dortmund University
Tim Katzke
Research Center Trustworthy Data Science and Security, TU Dortmund University
Emmanuel Müller
Professor of Computer Science, Technical University of Dortmund
Data Mining · Machine Learning · Data Exploration · Databases
Daniel Neider
TU Dortmund University and Center for Trustworthy Data Science and Security
Formal Methods · Machine Learning · Logic · Artificial Intelligence