🤖 AI Summary
To address label scarcity and poor representation transferability in cross-domain graph learning, this paper proposes LP-TGNN, a framework that unifies tensor-based modeling of graph topology with cross-domain label propagation inside a GNN architecture. It explicitly captures domain-invariant structural patterns and aligns semantics across domains. Key components include a tensor-based graph encoder, consistency-driven cross-domain label propagation, pseudo-label-guided self-training, and domain discrepancy constraints via MMD and CDAN. LP-TGNN is plug-and-play and compatible with mainstream GNNs and domain adaptation methods. Extensive experiments on multi-source cross-domain graph benchmarks, including Amazon and COIL-100, demonstrate average accuracy improvements of 5.2%–9.7% over state-of-the-art baselines, and ablation studies confirm that each module contributes substantially to overall performance.
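The summary mentions domain discrepancy constraints via MMD. As a hedged illustration only (the paper's exact formulation, kernel choice, and bandwidth are not given here), a minimal RBF-kernel estimate of squared MMD between source and target embeddings can be sketched as follows; the function names `rbf_kernel` and `mmd2` and the fixed bandwidth `sigma` are assumptions for this sketch:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel between rows of X and rows of Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(Xs, Xt, sigma=1.0):
    # Biased estimate of squared Maximum Mean Discrepancy between the
    # source embedding set Xs and target embedding set Xt; minimizing
    # this term encourages the two embedding distributions to match.
    return (rbf_kernel(Xs, Xs, sigma).mean()
            + rbf_kernel(Xt, Xt, sigma).mean()
            - 2.0 * rbf_kernel(Xs, Xt, sigma).mean())
```

In training, such a term would typically be added to the classification loss with a trade-off weight, pulling source and target graph representations toward a common distribution.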
📝 Abstract
Graph Neural Networks (GNNs) have recently become the predominant tools for studying graph data. Despite state-of-the-art performance on graph classification tasks, GNNs are overwhelmingly trained under supervision in a single domain, which imposes a prohibitive demand for labels and yields poorly transferable representations. To address this challenge, we propose the Label-Propagation Tensor Graph Neural Network (LP-TGNN) framework to bridge the gap between graph data and traditional domain adaptation methods. It extracts graph topological information holistically with a tensor architecture and then reduces domain discrepancy through label propagation. It is readily compatible with general GNNs and domain adaptation techniques, requiring only minimal adjustment through pseudo-labeling. Experiments on various real-world benchmarks show that LP-TGNN outperforms baselines by a notable margin, and an ablation study validates and analyzes each component of the proposed framework.
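The abstract notes that compatibility with standard domain adaptation techniques comes through pseudo-labeling. A common version of this step, sketched here as an assumption rather than the paper's exact procedure, keeps only target predictions whose confidence exceeds a threshold and feeds them back as hard labels for self-training; the helper name `select_pseudo_labels` and the threshold value are hypothetical:

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.9):
    # probs: (n_target, n_classes) softmax outputs for unlabeled target graphs.
    # Keep only samples whose top-class probability clears the threshold,
    # and return their indices plus the corresponding hard pseudo-labels.
    conf = probs.max(axis=1)
    idx = np.where(conf >= threshold)[0]
    return idx, probs[idx].argmax(axis=1)
```

The selected (index, pseudo-label) pairs can then be treated as labeled data in the next training round, which is what lets otherwise label-hungry supervised pipelines operate on an unlabeled target domain.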