DACAD: Domain Adaptation Contrastive Learning for Anomaly Detection in Multivariate Time Series

📅 2024-04-17
🏛️ IEEE Transactions on Knowledge and Data Engineering
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the failure of unsupervised domain adaptation (UDA) methods when anomaly categories are inconsistent across domains, this paper proposes a cross-domain anomaly detection framework for label-scarce multivariate time series. Methodologically: (1) a controllable anomaly injection mechanism improves generalization to unseen anomaly classes; (2) a supervised contrastive loss on the source domain is jointly optimized with a self-supervised triplet contrastive loss on the target domain, relaxing the assumption of consistent anomaly categories; (3) a Center-based Entropy Classifier (CEC) precisely models the decision boundary of normal patterns. Extensive experiments on multiple real-world datasets and a synthetic dataset show that the method significantly outperforms state-of-the-art UDA and time-series anomaly detection (TSAD) approaches, achieving effective knowledge transfer under label scarcity with both high accuracy and strong robustness.
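The two losses described above can be sketched as follows. This is a minimal illustration under stated assumptions, not DACAD's actual implementation: the exact loss formulations, temperature, and margin values in the paper may differ, and the pairing of anchor/positive/negative windows is simplified here.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(z, labels, tau=0.1):
    # Supervised contrastive loss on labeled source-domain embeddings:
    # windows sharing a label (normal vs. anomalous) are pulled together,
    # windows with different labels are pushed apart.
    z = F.normalize(z, dim=1)
    sim = (z @ z.T) / tau
    # Numerical stability: subtract the row-wise max before exponentiating.
    sim = sim - sim.max(dim=1, keepdim=True).values.detach()
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    pos_mask = ((labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask).float()
    exp_sim = torch.exp(sim).masked_fill(self_mask, 0.0)
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    return -((log_prob * pos_mask).sum(dim=1) / pos_count).mean()

def target_triplet_loss(anchor, positive, negative, margin=1.0):
    # Self-supervised triplet loss on the unlabeled target domain:
    # positive = a benign view of the anchor window,
    # negative = the same window with an injected synthetic anomaly.
    return F.triplet_margin_loss(anchor, positive, negative, margin=margin)
```

In training, the two terms would be summed (possibly with a weighting coefficient) so that source supervision and target self-supervision shape a shared, domain-invariant embedding space.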

📝 Abstract
In time series anomaly detection (TSAD), the scarcity of labeled data poses a challenge to the development of accurate models. Unsupervised domain adaptation (UDA) offers a solution by leveraging labeled data from a related domain to detect anomalies in an unlabeled target domain. However, existing UDA methods assume consistent anomalous classes across domains. To address this limitation, we propose a novel Domain Adaptation Contrastive learning model for Anomaly Detection in multivariate time series (DACAD), combining UDA with contrastive learning. DACAD utilizes an anomaly injection mechanism that enhances generalization across unseen anomalous classes, improving adaptability and robustness. Additionally, our model employs supervised contrastive loss for the source domain and self-supervised contrastive triplet loss for the target domain, ensuring comprehensive feature representation learning and domain-invariant feature extraction. Finally, an effective Center-based Entropy Classifier (CEC) accurately learns normal boundaries in the source domain. Extensive evaluations on multiple real-world datasets and a synthetic dataset highlight DACAD's superior performance in transferring knowledge across domains and mitigating the challenge of limited labeled data in TSAD.
Problem

Research questions and friction points this paper is trying to address.

Addresses scarcity of labeled data in time series anomaly detection
Handles inconsistent anomalous classes across domains in adaptation
Improves generalization for unseen anomalies via contrastive learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines UDA with contrastive learning
Uses anomaly injection for generalization
Employs supervised and self-supervised contrastive losses
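The anomaly injection idea above can be illustrated with a minimal sketch. The paper's actual injection mechanism is richer (it is described as controllable and covering multiple anomaly types); the spike-injection helper below is a hypothetical simplification for one common anomaly type.

```python
import numpy as np

def inject_spike(window: np.ndarray, rng: np.random.Generator,
                 scale: float = 3.0) -> np.ndarray:
    # Hypothetical simplification: perturb one timestep of one channel of a
    # multivariate window (shape: T x C) by a large multiple of that channel's
    # spread, yielding a synthetic anomalous view for contrastive training.
    out = window.copy()
    t = int(rng.integers(0, window.shape[0]))
    c = int(rng.integers(0, window.shape[1]))
    out[t, c] += scale * (window[:, c].std() + 1.0)
    return out
```

Such injected windows serve as negatives in the target-domain triplet loss, letting the model learn a normal/anomalous boundary without target labels.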
Zahra Zamanzadeh Darban
Monash University, Melbourne
Geoffrey I. Webb
Monash University, Melbourne
Mahsa Salehi
Senior Lecturer, Monash University
Anomaly Detection · Time Series Analysis · Machine Learning · Brain EEG Analysis