🤖 AI Summary
This work proposes a novel approach to time-series anomaly detection that addresses a key limitation of observation-likelihood-based methods: likelihood in observation space often fails to capture structured temporal dynamics, so anomalies can be scored as normal. By introducing inductive biases into the latent space of conditional normalizing flows, the method models time series as discrete-time state-space systems, constraining latent trajectories to follow prescribed dynamical laws; anomalies are then defined as deviations from these expected dynamics. Anomaly detection is framed, for the first time, as a goodness-of-fit test for dynamic consistency, evaluating whether latent trajectories comply with the prescribed evolution. Experiments on synthetic and real-world datasets demonstrate reliable detection of anomalies in frequency, amplitude, and noise characteristics, with high detection performance and strong interpretability.
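The latent view described above can be made concrete with a small sketch. The sketch below is illustrative only, not the paper's implementation: it takes a prescribed discrete-time linear-Gaussian state-space law z_{t+1} = A z_t + w_t, w_t ~ N(0, Q), rolls out a compliant trajectory and a trajectory that violates the dynamics (via an assumed additive bias), and shows that the one-step residual energy separates the two. The matrices `A` and `Q` and the bias term are assumptions made for the example.

```python
import numpy as np

# Minimal sketch (not the paper's code) of prescribed latent dynamics:
#   z_{t+1} = A z_t + w_t,  w_t ~ N(0, Q).
# A compliant trajectory follows these dynamics; an anomalous one
# violates them (here, via an illustrative drift/bias term).

rng = np.random.default_rng(0)
A = np.array([[0.9, -0.2], [0.2, 0.9]])  # prescribed transition (illustrative)
Q = 0.05 * np.eye(2)                     # process-noise covariance (illustrative)

def simulate(T, bias=0.0):
    """Roll out a latent trajectory; a nonzero bias violates the dynamics."""
    z = np.zeros((T, 2))
    for t in range(T - 1):
        w = rng.multivariate_normal(np.zeros(2), Q)
        z[t + 1] = A @ z[t] + w + bias
    return z

def mean_residual_energy(z):
    """Average squared one-step residual under the prescribed dynamics."""
    resid = z[1:] - z[:-1] @ A.T
    return float(np.mean(np.sum(resid**2, axis=1)))

normal = simulate(500)            # complies with the prescribed law
anomalous = simulate(500, bias=0.5)  # deviates from it
print(mean_residual_energy(normal), mean_residual_energy(anomalous))
```

For the compliant trajectory the residuals are just the process noise, so their mean energy stays near trace(Q); the biased trajectory's residuals carry the violation and score markedly higher, which is the sense in which "anomaly" lives in latent dynamics rather than observation likelihood.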
📝 Abstract
Deep generative models for anomaly detection in multivariate time series are typically trained by maximizing data likelihood. However, likelihood in observation space measures marginal density rather than conformity to structured temporal dynamics, and therefore can assign high probability to anomalous or out-of-distribution samples. We address this structural limitation by relocating the notion of anomaly to a prescribed latent space. We introduce explicit inductive biases in conditional normalizing flows, modeling time-series observations within a discrete-time state-space framework that constrains latent representations to evolve according to prescribed temporal dynamics. Under this formulation, expected behavior corresponds to compliance with a specified distribution over latent trajectories, while anomalies are defined as violations of these dynamics. Anomaly detection consequently reduces to a statistically grounded compliance test: observations are mapped to latent space and evaluated via goodness-of-fit tests against the prescribed latent evolution. This yields a principled decision rule that remains effective even in regions of high observation likelihood. Experiments on synthetic and real-world time series demonstrate reliable detection of anomalies in frequency, amplitude, and observation noise, while providing interpretable diagnostics of model compliance.
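The compliance test can be sketched end to end. Everything here is an assumption made for illustration: `flow` stands in for a learned invertible conditional normalizing flow (here just the identity map), `A` and `sigma` are an assumed prescribed linear-Gaussian law z_{t+1} = A z_t + w_t with w_t ~ N(0, sigma² I), and the goodness-of-fit statistic is a simple chi-square-type sum of whitened squared residuals with a Monte Carlo null threshold, not the paper's specific test.

```python
import numpy as np

rng = np.random.default_rng(1)
d, T = 2, 200
A = 0.95 * np.eye(d)  # prescribed latent transition (illustrative)
sigma = 0.1           # prescribed process-noise scale (illustrative)

def flow(x):
    # Stand-in for a learned conditional normalizing flow mapping
    # observations to latent space; identity here, for the sketch only.
    return x

def compliance_statistic(z):
    """Sum of whitened squared one-step residuals.

    Under the prescribed dynamics it follows chi^2((T-1)*d)."""
    resid = (z[1:] - z[:-1] @ A.T) / sigma
    return float(np.sum(resid**2))

def null_threshold(alpha=0.01, n_mc=5000):
    """Monte Carlo (1 - alpha) quantile of the null statistic."""
    samples = rng.chisquare((T - 1) * d, size=n_mc)
    return float(np.quantile(samples, 1 - alpha))

# Compliant window: latents actually follow the prescribed law.
z = np.zeros((T, d))
for t in range(T - 1):
    z[t + 1] = A @ z[t] + sigma * rng.standard_normal(d)

# Anomalous window: a noise-characteristic anomaly, modeled here as
# inflated process noise that violates the prescribed dynamics.
z_anom = np.zeros((T, d))
for t in range(T - 1):
    z_anom[t + 1] = A @ z_anom[t] + 3 * sigma * rng.standard_normal(d)

thr = null_threshold()
print(compliance_statistic(flow(z)), compliance_statistic(flow(z_anom)), thr)
```

The decision rule is simply "flag the window when the statistic exceeds the null threshold"; because the test is performed in latent space against the prescribed evolution, it can fire even where the observation-space likelihood of the window is high.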