🤖 AI Summary
This paper addresses unsupervised sequence-level anomaly detection for multivariate time series. To overcome the limitation of conventional methods that model subsequences in isolation, we propose a Global Dictionary-Enhanced Transformer framework. Our key contributions are: (1) a novel global dictionary-driven cross-attention mechanism that explicitly models the association weights between each timestamp and globally learned normal patterns; (2) a unified detection criterion based on representation similarity, coupled with prototype learning to characterize the distribution of normality-related weights; and (3) a transferable global dictionary designed for cross-dataset generalization. Evaluated on five real-world benchmarks, our method achieves state-of-the-art performance. Ablation and dictionary-transfer experiments demonstrate its strong generalizability across datasets. The source code is publicly available.
📝 Abstract
Unsupervised anomaly detection in multivariate time series is a challenging task, given the requirement of deriving a compact detection criterion without access to labeled anomaly points. Existing methods are mainly based on reconstruction error or association divergence; both are confined to isolated subsequences with limited horizons and hardly yield a unified, series-level criterion. In this paper, we propose the Global Dictionary-enhanced Transformer (GDformer) with a renovated dictionary-based cross-attention mechanism to cultivate global representations shared by all normal points in the entire series. The cross-attention maps then reflect the correlation weights between each point and the global representations, which naturally leads to a representation-similarity-based detection criterion. To foster a more compact detection boundary, prototypes are introduced to capture the distribution of the normal point-global correlation weights. GDformer consistently achieves state-of-the-art unsupervised anomaly detection performance on five real-world benchmark datasets. Further experiments validate that the global dictionary transfers well across datasets. The code is available at https://github.com/yuppielqx/GDformer.
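To make the core idea concrete, here is a minimal numpy sketch of dictionary-based cross-attention scoring. It is an illustration of the general mechanism described above, not the authors' implementation: the dictionary size, the scaled-dot-product attention form, and the cosine-dissimilarity score are all simplifying assumptions; GDformer additionally learns the dictionary end-to-end and refines the boundary with prototypes.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dictionary_cross_attention(series, dictionary):
    """Cross-attention between each timestamp and a global dictionary.

    series:     (T, d) per-timestamp embeddings.
    dictionary: (K, d) globally shared normal-pattern entries (assumed
                already learned; here just a fixed array).
    Returns (T, K) correlation weights and (T, d) dictionary-based
    representations of each point.
    """
    d = series.shape[-1]
    scores = series @ dictionary.T / np.sqrt(d)   # (T, K) affinities
    weights = softmax(scores, axis=-1)            # point-global correlation weights
    recon = weights @ dictionary                  # representation from the dictionary
    return weights, recon

def anomaly_score(series, dictionary):
    """Representation-similarity criterion (illustrative): a point whose
    dictionary-based representation deviates from its own embedding is
    scored as more anomalous. Here: cosine dissimilarity."""
    _, recon = dictionary_cross_attention(series, dictionary)
    num = (series * recon).sum(-1)
    den = (np.linalg.norm(series, axis=-1)
           * np.linalg.norm(recon, axis=-1) + 1e-8)
    return 1.0 - num / den
```

For example, with dictionary entries `[1, 0]` and `[0, 1]`, a normal point such as `[1, 0]` is well expressed by the dictionary and receives a low score, while a point like `[-1, -1]` cannot be composed from the entries and scores far higher, which is the intuition behind using the shared global dictionary as a series-level normality reference.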