Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors

📅 2024-08-13
🏛️ arXiv.org
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses inductive link prediction on Text-Attributed Knowledge Graphs (TAKGs), proposing an efficient, lightweight, fully inductive approach. The core challenges are reducing reliance on heavy pre-trained text encoders, improving training and inference efficiency, and generating relation representations dynamically from textual descriptions, i.e., operating in a fully inductive setting. Methodologically, the authors design a lightweight Transformer-based architecture that jointly encodes one-hop ego-graph structure and node/relation textual semantics, augmented with a dynamic relation-description encoding mechanism. The contributions are threefold: (1) the first fully inductive link prediction framework for TAKGs; (2) the first benchmark explicitly designed to evaluate generalization to unseen relations; and (3) state-of-the-art performance on three major datasets, with significantly faster training and inference and substantially reduced GPU memory and compute cost.

๐Ÿ“ Abstract
We propose Fast-and-Frugal Text-Graph (FnF-TG) Transformers, a Transformer-based framework that unifies textual and structural information for inductive link prediction in text-attributed knowledge graphs. We demonstrate that, by effectively encoding ego-graphs (1-hop neighbourhoods), we can reduce the reliance on resource-intensive textual encoders. This makes the model both fast at training and inference time and frugal in terms of cost. We perform a comprehensive evaluation on three popular datasets and show that FnF-TG achieves superior performance compared to previous state-of-the-art methods. We also extend inductive learning to a fully inductive setting, where relations do not rely on transductive (fixed) representations, as in previous work, but are a function of their textual description. Additionally, we introduce new variants of existing datasets, specifically designed to test the performance of models on unseen relations at inference time, thus offering a new test-bench for fully inductive link prediction.
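The abstract's core mechanism (encoding a node's 1-hop ego-graph jointly with node and relation text, and computing relation representations from their textual descriptions) can be sketched in miniature. The following pure-Python toy is an illustration only, not the paper's implementation: every name is hypothetical, the hash-based vectors stand in for learned embeddings, and a single untrained attention step stands in for the full Transformer.

```python
import hashlib
import math

DIM = 16  # toy embedding width

def embed_token(tok: str) -> list[float]:
    # Deterministic pseudo-embedding: hash the token into a fixed vector
    # (a stand-in for a learned embedding table).
    digest = hashlib.md5(tok.encode()).digest()
    return [(digest[i % len(digest)] / 127.5) - 1.0 for i in range(DIM)]

def mean_pool(vectors: list[list[float]]) -> list[float]:
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(DIM)]

def attend(seq: list[list[float]]) -> list[float]:
    # Single-head dot-product self-attention, keeping only the
    # contextualised vector of the first (centre-node) position.
    scale = math.sqrt(DIM)
    q = seq[0]
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / scale for k in seq]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [sum((e / z) * v[i] for e, v in zip(exps, seq)) for i in range(DIM)]

def encode_ego_graph(centre_text: str,
                     neighbours: list[tuple[str, str]]) -> list[float]:
    # Linearise the 1-hop ego-graph: centre-node tokens first, then each
    # (relation description, neighbour description) pair, so attention can
    # mix structural and textual signals into one node vector.
    tokens = centre_text.split()
    for rel_text, nb_text in neighbours:
        tokens += rel_text.split() + nb_text.split()
    return attend([embed_token(t) for t in tokens])

def encode_relation(rel_text: str) -> list[float]:
    # Fully inductive relation representation: a function of the relation's
    # textual description rather than a fixed per-relation embedding.
    return mean_pool([embed_token(t) for t in rel_text.split()])

def score(head: list[float], rel: list[float], tail: list[float]) -> float:
    # DistMult-style triple score (one common decoder choice; the paper's
    # exact scoring function may differ).
    return sum(h * r * t for h, r, t in zip(head, rel, tail))
```

Because both entity and relation representations are computed from text and local structure at inference time, nothing in this pipeline requires the test-time entities or relations to have been seen during training, which is what "fully inductive" means here.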
Problem

Research questions and friction points this paper is trying to address.

Unify textual and structural information for inductive link prediction in text-attributed knowledge graphs
Reduce reliance on resource-intensive textual encoders at training and inference time
Extend inductive learning to a fully inductive setting that covers relations unseen during training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight Transformer that jointly encodes 1-hop ego-graph structure and node/relation text
Dynamic relation representations computed from textual descriptions instead of fixed embeddings
New dataset variants that benchmark performance on unseen relations at inference time
🔎 Similar Papers
No similar papers found.