🤖 AI Summary
Dynamic graphs, such as those modeling traffic or computer networks, pose significant challenges for anomaly detection due to high false positive rates and the difficulty of jointly modeling temporal dependencies and structural dynamics. To address these issues, this paper proposes a framework integrating temporal modeling with Extreme Value Theory (EVT). First, multi-scale feature extraction and temporal modeling are applied to the graph sequence; then, residual analysis separates normal evolutionary patterns from anomalous signals. Crucially, EVT is introduced, for the first time in this context, to robustly model the tail distribution of the residuals, enabling statistically principled anomaly discrimination. The method requires no prior assumptions about anomaly patterns and effectively mitigates interference from temporal dependencies. Extensive experiments on multiple real-world dynamic graph datasets show that the approach outperforms state-of-the-art baselines, including TensorSplat and Laplacian Anomaly Detection, in accuracy, while reducing the average false positive rate by 32.7%.
📝 Abstract
Detecting anomalies in a temporal sequence of graphs can be applied in areas such as the detection of accidents in transport networks and cyber attacks in computer networks. Existing methods for detecting abnormal graphs can suffer from multiple limitations, such as high false positive rates as well as difficulties in handling variable-sized graphs and non-trivial temporal dynamics. To address these limitations, we propose a technique where temporal dependencies are explicitly modelled via time series analysis of a large set of pertinent graph features, with the resulting residuals used to remove those dependencies. Extreme Value Theory is then used to robustly model and classify any remaining extremes, aiming to produce low false positive rates. Comparative evaluations on a multitude of graph instances show that the proposed approach obtains considerably better accuracy than TensorSplat and Laplacian Anomaly Detection.
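The abstract does not give implementation details, but the EVT step it describes corresponds to the classical peaks-over-threshold approach: fit a Generalised Pareto Distribution to the residual exceedances over a high empirical quantile, then set the anomaly threshold at a chosen tail probability. The sketch below illustrates that step only, assuming residuals from the time-series model are already available; the function name, thresholds, and synthetic data are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import genpareto

def evt_threshold(residuals, tail_frac=0.05, alpha=1e-3):
    """Peaks-over-threshold sketch: fit a Generalised Pareto Distribution
    (GPD) to exceedances over a high empirical quantile, then derive an
    anomaly threshold at overall tail probability `alpha`."""
    u = np.quantile(residuals, 1.0 - tail_frac)    # high threshold u
    excess = residuals[residuals > u] - u          # exceedances over u
    c, _, scale = genpareto.fit(excess, floc=0.0)  # fix GPD location at 0
    # P(X > z) = P(X > u) * P(X - u > z - u | X > u)
    #          ~ tail_frac * GPD.sf(z - u), so solve for z at probability alpha
    z = u + genpareto.ppf(1.0 - alpha / tail_frac, c, loc=0.0, scale=scale)
    return z

# Illustrative usage: Gaussian residuals with one injected extreme value.
rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 1.0, size=5000)
residuals[100] = 10.0                              # injected anomaly
z = evt_threshold(residuals)
flags = residuals > z                              # points flagged anomalous
```

Because the threshold is derived from the fitted tail rather than a fixed cut-off, the false positive rate is controlled explicitly by `alpha`, which is the statistical mechanism the abstract credits for keeping false positives low.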