TSINR: Capturing Temporal Continuity via Implicit Neural Representations for Time Series Anomaly Detection

📅 2024-11-18
🏛️ Knowledge Discovery and Data Mining
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
In time-series anomaly detection, reconstruction-based methods suffer from compromised normal pattern learning when training data contain unlabeled anomalies. To address this, we propose the first Implicit Neural Representation (INR)-based framework tailored for time-series anomaly detection. Leveraging INRs’ inherent spectral bias, our method enhances sensitivity to discontinuous anomalies; we design a continuous functional modeling architecture that jointly captures intra- and inter-channel dependencies; and we integrate large language models to strengthen semantic representation of anomalous fluctuations. The framework achieves synergistic optimization across three dimensions: frequency-domain modeling, continuous representation learning, and semantic enhancement. Extensive experiments on both multivariate and univariate benchmarks demonstrate that our approach consistently outperforms state-of-the-art reconstruction-based methods, achieving significant improvements in anomaly detection recall and localization accuracy.

πŸ“ Abstract
Time series anomaly detection aims to identify unusual patterns in data or deviations from a system's expected behavior. Reconstruction-based methods are the mainstream approach to this task, learning point-wise representations via unsupervised learning. However, unlabeled anomaly points in the training data may cause these methods to learn and reconstruct anomalous data, making it difficult to capture normal patterns. In this paper, we propose a time series anomaly detection method based on implicit neural representation (INR) reconstruction, named TSINR, to address this challenge. Owing to the spectral-bias property, TSINR prioritizes low-frequency signals and reconstructs high-frequency abnormal data poorly. Specifically, we adopt INR to parameterize time series data as a continuous function and employ a transformer-based architecture to predict the INR of given data. As a result, TSINR captures temporal continuity and is therefore more sensitive to discontinuous anomalous data. In addition, we design a novel form of INR continuous function to learn inter- and intra-channel information, and leverage a pre-trained large language model to amplify the intense fluctuations in anomalies. Extensive experiments demonstrate that TSINR achieves superior overall performance on both univariate and multivariate time series anomaly detection benchmarks compared to other state-of-the-art reconstruction-based methods. Our codes are available here.
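The core idea — fit a continuous function of time whose spectral bias reproduces smooth normal structure but not abrupt anomalies, then score points by reconstruction error — can be sketched in a few lines. The sketch below is illustrative, not the paper's implementation: it replaces the transformer-predicted INR network with a ridge-regression readout over low-frequency Fourier features (all function names here are hypothetical).

```python
import numpy as np

def fourier_features(t, n_freqs):
    # Encoding restricted to low frequencies, mimicking the spectral
    # bias of INRs: smooth structure is representable, spikes are not.
    cols = [np.ones_like(t)]
    for k in range(1, n_freqs + 1):
        cols.append(np.sin(2 * np.pi * k * t))
        cols.append(np.cos(2 * np.pi * k * t))
    return np.stack(cols, axis=1)

def fit_inr(t, x, n_freqs=4, lam=1e-3):
    # Ridge regression stands in for training an INR on (time, value)
    # pairs; it returns a continuous function of time.
    Phi = fourier_features(t, n_freqs)
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ x)
    return lambda tq: fourier_features(tq, n_freqs) @ w

def anomaly_scores(t, x, n_freqs=4):
    # Reconstruction error: the low-frequency fit reproduces normal
    # points well, so discontinuous anomalies score high.
    f = fit_inr(t, x, n_freqs)
    return np.abs(x - f(t))

# Demo: smooth sine with one injected point anomaly.
t = np.linspace(0.0, 1.0, 200)
x = np.sin(2 * np.pi * 2 * t)  # smooth normal pattern
x[100] += 3.0                  # discontinuous spike
scores = anomaly_scores(t, x)
print(int(np.argmax(scores)))  # → 100, the injected anomaly
```

The spike barely moves the low-frequency fit, so its residual stays near the injected magnitude while residuals elsewhere stay near zero — the same mechanism TSINR exploits, with a learned network in place of the fixed feature map.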
Problem

Research questions and friction points this paper is trying to address.

Detects anomalies in time series by learning normal patterns via implicit neural representations.
Addresses reconstruction bias from unlabeled anomalies by prioritizing low-frequency signals.
Improves sensitivity to discontinuous anomalies using temporal continuity and transformer-based architecture.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses implicit neural representation for time series
Leverages transformer-based architecture for prediction
Incorporates large language model for anomaly detection
Mengxuan Li
Zhejiang University
Ke Liu
College of Computer Science, Zhejiang University
Hongyang Chen
Sun Yat-sen University
SDN, Cloud Computing, Microservice, AIOps
Jiajun Bu
College of Computer Science, Zhejiang University
Hongwei Wang
ZJU-UIUC Institute, Zhejiang University
Haishuai Wang
Harvard University
Data Mining, Machine Learning