🤖 AI Summary
For knowledge graph completion via link prediction, this paper proposes a relation-aware anchor entity mechanism that models the relational neighborhood of the head entity as dynamic context to enhance the query representation. Methodologically, it introduces a pre-trained language model (PLM) based framework integrating relational neighborhood sampling, anchor entity generation, and contrastive embedding alignment, enabling fine-grained alignment between query embeddings and relation-aware context. Key contributions include: (1) explicit modeling of relation-path-guided neighbor structures to improve contextual discriminability; and (2) plug-and-play integration that requires no modification to backbone architectures while consistently boosting the performance of existing methods. The approach achieves state-of-the-art results on standard benchmarks including FB15k-237 and WN18RR, significantly outperforming established baselines.
📝 Abstract
Text-based knowledge graph completion methods take advantage of pre-trained language models (PLMs) to enhance the intrinsic semantic connections of raw triples with detailed text descriptions. Typical methods in this branch map an input query (the textual descriptions associated with an entity and a relation) and its candidate entities into feature vectors, and then maximize the probability of valid triples. These methods achieve promising performance and are attracting increasing attention owing to the rapid development of large language models. Given how language models work, the more relevant and specific the context provided by the input query, the more discriminative the resulting embedding will be. In this paper, through observation and validation, we identify a neglected fact: the relation-aware neighbors of the head entity in a query can act as effective context for more precise link prediction. Driven by this finding, we propose a relation-aware anchor enhanced knowledge graph completion method (RAA-KGC). Specifically, to provide a reference for what the target entity might look like, our method first generates anchor entities within the relation-aware neighborhood of the head entity. Then, by pulling the query embedding towards the neighborhoods of the anchors, the query is tuned to be more discriminative for target-entity matching. Extensive experiments not only validate the efficacy of RAA-KGC but also show that integrating our relation-aware anchor enhancement strategy into current leading methods notably improves their performance without substantial modifications.
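The core idea, sampling anchor entities from the relation-aware neighborhood of the head entity and pulling the query embedding toward them with a contrastive objective, can be sketched on a toy knowledge graph. Everything below is an illustrative assumption (the additive query encoder, the soft-target cross-entropy loss, the toy triples), not the paper's actual PLM-based implementation:

```python
import torch
import torch.nn.functional as F

# Toy KG: triples of (head, relation, tail) entity/relation ids.
# These triples are made up for illustration only.
triples = [(0, 0, 1), (0, 0, 2), (0, 1, 3), (4, 0, 2)]

def relation_aware_neighbors(head, relation, triples):
    """Entities linked to `head` via `relation` -- the candidate anchors."""
    return [t for (h, r, t) in triples if h == head and r == relation]

torch.manual_seed(0)
num_entities, dim = 5, 8
entity_emb = torch.nn.Embedding(num_entities, dim)
relation_emb = torch.nn.Embedding(2, dim)

def query_embedding(head, relation):
    # A simple additive composition stands in for the PLM query encoder.
    return entity_emb(torch.tensor(head)) + relation_emb(torch.tensor(relation))

def anchor_contrastive_loss(head, relation, temperature=0.1):
    anchors = relation_aware_neighbors(head, relation, triples)
    q = F.normalize(query_embedding(head, relation), dim=-1)
    all_e = F.normalize(entity_emb.weight, dim=-1)
    logits = (all_e @ q) / temperature           # similarity of query to every entity
    target = torch.zeros(num_entities)
    target[anchors] = 1.0 / len(anchors)         # soft target: the anchor neighborhood
    # Cross-entropy with probability targets pulls q toward the anchors'
    # region of the embedding space and away from the other entities.
    return F.cross_entropy(logits.unsqueeze(0), target.unsqueeze(0))

params = list(entity_emb.parameters()) + list(relation_emb.parameters())
opt = torch.optim.SGD(params, lr=0.5)
losses = []
for _ in range(20):
    opt.zero_grad()
    loss = anchor_contrastive_loss(0, 0)
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

Because the anchors already satisfy the query relation with the head, matching against their neighborhood gives the query embedding a concrete reference for what the target entity should look like; the loss above decreases as the query moves toward that neighborhood.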