Beyond Marginals: Learning Joint Spatio-Temporal Patterns for Multivariate Anomaly Detection

📅 2025-09-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing multivariate time series anomaly detection methods often assume conditional independence among variables, neglecting time-varying, nonlinear spatiotemporal dependencies, and thus fail to identify “collective anomalies”—patterns where the system deviates anomalously as a whole despite individual variables appearing normal. To address this, we propose a decoupled modeling framework that separately learns temporal dynamics (via a Transformer encoder) and inter-variable spatial dependencies (via Copula-based multivariate likelihood modeling) in a shared latent space, jointly optimized through self-supervised contrastive learning. Our approach explicitly captures dynamic, nonlinear, and high-order dependencies without relying on strong independence assumptions. Evaluated on multiple real-world benchmarks, it achieves significant improvements over state-of-the-art methods—particularly in complex collective anomaly scenarios—demonstrating superior detection accuracy and robustness.

📝 Abstract
In this paper, we aim to improve multivariate anomaly detection (AD) by modeling the time-varying, non-linear spatio-temporal correlations found in multivariate time series data. In multivariate time series, an anomaly may be indicated by the simultaneous deviation of interrelated series from their expected collective behavior, even when no individual series exhibits a clearly abnormal pattern on its own. Many existing approaches assume time series variables to be (conditionally) independent, which oversimplifies real-world interactions. Our approach addresses this by modeling joint dependencies in the latent space and decoupling the modeling of marginal distributions, temporal dynamics, and inter-variable dependencies. We use a Transformer encoder to capture temporal patterns, and we fit a multivariate likelihood with a copula to model spatial (inter-variable) dependencies. The temporal and spatial components are trained jointly in a latent space with a self-supervised contrastive learning objective, learning feature representations that separate normal from anomalous samples.
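The copula component described in the abstract can be illustrated with a minimal sketch. The paper does not specify the copula family, so this assumes a Gaussian copula with empirical (rank-based) marginal CDFs; `gaussian_copula_logpdf` and its arguments are illustrative names:

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_logpdf(X, R):
    """Per-row log-density of a Gaussian copula with correlation matrix R.

    Marginals are handled non-parametrically: each variable is mapped to
    uniform via its empirical CDF (ranks), decoupling marginal fit from
    the inter-variable dependency model.
    """
    n, d = X.shape
    U = rankdata(X, axis=0) / (n + 1)        # probability-integral transform
    Z = norm.ppf(U)                          # map to standard-normal space
    R_inv = np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(R)
    # Quadratic form z^T (R^{-1} - I) z for each row z of Z
    quad = np.einsum('ij,jk,ik->i', Z, R_inv - np.eye(d), Z)
    return -0.5 * quad - 0.5 * logdet
```

A low copula log-likelihood flags a joint deviation from the learned dependency structure even when every individual variable stays within its marginal range, which is exactly the "collective anomaly" scenario the paper targets.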
Problem

Research questions and friction points this paper is trying to address.

Modeling time-varying non-linear spatio-temporal correlations in multivariate data
Detecting anomalies through simultaneous deviation of interrelated time series
Addressing oversimplified independence assumptions in existing anomaly detection approaches
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer encoder captures temporal patterns
Multivariate likelihood and copula model spatial dependencies
Self-supervised contrastive learning in latent space
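The exact form of the contrastive objective is not given in this summary. A common self-supervised choice, sketched here as an assumption, is an InfoNCE-style loss over two augmented views of the same windows, where matching rows are positives and all other rows in the batch are negatives (function name and temperature value are illustrative):

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.1):
    """InfoNCE contrastive loss between two views of the same batch.

    Row i of z_a and row i of z_b embed the same sample (positive pair);
    every other row acts as a negative.
    """
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature            # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives sit on the diagonal
```

Minimizing this loss pulls the temporal and spatial embeddings of normal windows together in the shared latent space, so anomalies stand out as samples whose representations fail to align.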