Graph Anomaly Detection via Adaptive Test-time Representation Learning across Out-of-Distribution Domains

📅 2025-02-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the failure of supervised methods in cross-domain graph anomaly detection (GAD) caused by distribution shift and feature heterogeneity, this paper proposes AdaGraph-T3—a novel framework enabling test-time domain adaptation solely via self-supervision. Its key contributions are: (1) a test-time adaptive mechanism grounded in node homophily affinity; (2) edge-attention-weighted message passing to enhance structural awareness; and (3) domain-specific encoders jointly optimized with class-aware regularization to improve cross-domain generalization and mitigate class imbalance. The framework combines supervised and self-supervised learning during training with test-time self-supervised fine-tuning. Evaluated on multiple cross-domain benchmarks, AdaGraph-T3 achieves average improvements of 6.6% in AUROC and 7.9% in AUPRC over state-of-the-art methods, demonstrating superior performance and generalizability.

📝 Abstract
Graph Anomaly Detection (GAD) has demonstrated great effectiveness in identifying unusual patterns within graph-structured data. However, while labeled anomalies are often scarce in emerging applications, existing supervised GAD approaches are either ineffective or not applicable when moved across graph domains due to distribution shifts and heterogeneous feature spaces. To address these challenges, we present AdaGraph-T3, a novel test-time training framework for cross-domain GAD. AdaGraph-T3 combines supervised and self-supervised learning during training while adapting to a new domain during test time using only self-supervised learning by leveraging a homophily-based affinity score that captures domain-invariant properties of anomalies. Our framework introduces four key innovations to cross-domain GAD: an effective self-supervision scheme, an attention-based mechanism that dynamically learns edge importance weights during message passing, domain-specific encoders for handling heterogeneous features, and class-aware regularization to address imbalance. Experiments across multiple cross-domain settings demonstrate that AdaGraph-T3 significantly outperforms existing approaches, achieving average improvements of over 6.6% in AUROC and 7.9% in AUPRC compared to the best competing model.
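The homophily-based affinity score mentioned in the abstract can be sketched as follows: score each node by the mean cosine similarity between its features and those of its neighbors, so nodes that differ from their neighborhood (a domain-invariant signature of anomalies) receive low scores. This is a minimal illustration under assumed dense-matrix inputs, not the paper's exact formulation.

```python
import numpy as np

def homophily_affinity(features: np.ndarray, adj: np.ndarray) -> np.ndarray:
    """Per-node affinity: mean cosine similarity between a node's feature
    vector and those of its neighbors. Anomalous nodes tend to score low."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    unit = features / np.clip(norms, 1e-12, None)   # row-normalize
    sim = unit @ unit.T                             # pairwise cosine similarity
    deg = np.clip(adj.sum(axis=1), 1, None)         # neighbor counts (avoid /0)
    return (adj * sim).sum(axis=1) / deg

# Toy graph: nodes 0-2 are mutually similar; node 3 is an outlier.
X = np.array([[1.0, 0.0], [0.9, 0.1], [1.0, 0.1], [0.0, 1.0]])
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
scores = homophily_affinity(X, A)
print(scores.argmin())  # → 3 (the outlier has the lowest affinity)
```

Because the score depends only on relative similarity within a neighborhood, not on the absolute feature space, it remains meaningful when the feature distribution shifts between domains.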
Problem

Research questions and friction points this paper is trying to address.

Detecting anomalies in graph-structured data when labeled anomalies are scarce.
Adapting a detector to a new graph domain at test time without labels.
Handling distribution shifts and heterogeneous feature spaces across domains.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive test-time training framework
Self-supervised learning for domain adaptation
Attention-based edge importance mechanism
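The attention-based edge importance mechanism can be sketched GAT-style: each edge gets a learned score from the projected features of its endpoints, and messages are averaged with softmax-normalized weights per neighborhood. The parameter names (`W`, `a`) and the leaky-ReLU scoring are illustrative assumptions in the spirit of graph attention networks, not the paper's exact layer.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_message_passing(h, adj, W, a):
    """One round of attention-weighted message passing:
    score each edge (i, j) as leaky-ReLU(a^T [z_i || z_j]) on projected
    features z = hW, then aggregate neighbors with softmax weights."""
    z = h @ W
    out = np.zeros_like(z)
    for i in range(h.shape[0]):
        nbrs = np.nonzero(adj[i])[0]
        if len(nbrs) == 0:
            out[i] = z[i]                    # isolated node keeps its own state
            continue
        scores = np.array([np.concatenate([z[i], z[j]]) @ a for j in nbrs])
        scores = np.where(scores > 0, scores, 0.2 * scores)   # leaky ReLU
        alpha = softmax(scores)              # normalize per neighborhood
        out[i] = (alpha[:, None] * z[nbrs]).sum(axis=0)
    return out

# Usage on a 5-node ring graph with random features and weights.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
W = rng.normal(size=(4, 3))
a = rng.normal(size=(6,))                    # scores concatenated pairs (3+3)
adj = np.zeros((5, 5))
for i in range(5):
    adj[i, (i + 1) % 5] = adj[(i + 1) % 5, i] = 1.0
out = attention_message_passing(h, adj, W, a)   # shape (5, 3)
```

Learning `a` end-to-end lets the model down-weight edges that connect dissimilar nodes, which is what "dynamically learns edge importance" refers to.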