HybEA: Hybrid Attention Models for Entity Alignment

📅 2024-07-03
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Entity alignment (EA) across heterogeneous knowledge graphs (e.g., DBpedia, Wikidata) faces two intertwined challenges: structural heterogeneity, arising from divergent neighborhood topologies, and semantic heterogeneity, stemming from multi-source information such as entity names and literal attributes. Existing methods struggle to model both aspects jointly. To address this, the authors propose a dual-path attentional framework that separately encodes structural neighborhoods and factual features (e.g., entity names and literals) and employs adaptive attention mechanisms to fuse the two sources of evidence, with contrastive learning integrated to refine the embedding space. The implementation is open source. Extensive experiments on five widely used benchmarks show consistent gains: HybEA outperforms state-of-the-art methods by at least 5% and as much as 20+% Hits@1 (11+% on average), markedly improving robustness when aligning entities across structurally and semantically heterogeneous knowledge graphs.
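The adaptive fusion of structural and factual embeddings described above can be sketched as follows. This is a simplified illustration, not HybEA's actual architecture: the norm-based scorer and the `fuse_embeddings` helper are assumptions standing in for the paper's learned attention mechanism.

```python
import numpy as np

def fuse_embeddings(structural: np.ndarray, factual: np.ndarray) -> np.ndarray:
    """Combine per-entity structural and factual embeddings with
    adaptive (softmax) attention weights over the two views.

    NOTE: a minimal sketch of the dual-path idea; the norm-based
    scoring below is a placeholder for a learned scorer.
    """
    # Score each view per entity (here: embedding norm as a stand-in).
    scores = np.stack([
        np.linalg.norm(structural, axis=1),
        np.linalg.norm(factual, axis=1),
    ], axis=1)
    # Softmax over the two views gives per-entity attention weights.
    exp = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = exp / exp.sum(axis=1, keepdims=True)
    # Weighted sum of the two embedding views.
    return weights[:, :1] * structural + weights[:, 1:] * factual

emb = fuse_embeddings(np.ones((3, 4)), np.zeros((3, 4)))
print(emb.shape)  # (3, 4)
```

In the real model the two views would come from separate attention-based encoders trained jointly; the fusion step above only illustrates how per-entity weights let one view dominate where it is more informative.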

📝 Abstract
The proliferation of Knowledge Graphs (KGs) that support a wide variety of applications, like entity search, question answering and recommender systems, has led to the need for identifying overlapping information among different KGs. Entity Alignment (EA) is the problem of detecting such overlapping information among KGs that refer to the same real-world entities. Recent works have shown great potential in exploiting KG embeddings for the task of EA, with most works focusing on the structural representation of entities (i.e., entity neighborhoods) in a KG and some works also exploiting the available factual information of entities (e.g., their names and associated literal values). However, real-world KGs exhibit high levels of structural and semantic heterogeneity, making EA a challenging task in which most existing methods struggle to achieve good results. In this work, we propose HybEA, an open-source EA method that focuses on both structure and facts, using two separate attention-based models. Our experimental results show that HybEA outperforms state-of-the-art methods by at least 5% and as much as 20+% (with an average difference of 11+%) Hits@1, on 5 widely used benchmark datasets.
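Hits@1, the metric cited in the results above, is the fraction of source entities whose true counterpart is ranked first among all candidate targets. A minimal sketch (the similarity matrix and diagonal ground-truth convention are illustrative assumptions):

```python
import numpy as np

def hits_at_k(sim: np.ndarray, k: int = 1) -> float:
    """Hits@k for an alignment similarity matrix, where sim[i, j] is the
    similarity of source entity i to target entity j and the ground-truth
    match of source i is target i (diagonal convention)."""
    # Rank target entities per source row, highest similarity first.
    top_k = np.argsort(-sim, axis=1)[:, :k]
    # A hit when the true target index appears among the top-k ranks.
    hits = (top_k == np.arange(sim.shape[0])[:, None]).any(axis=1)
    return float(hits.mean())

sim = np.array([[0.9, 0.1, 0.0],
                [0.2, 0.8, 0.1],
                [0.7, 0.2, 0.3]])  # row 2's best match is column 0, not 2
print(hits_at_k(sim, k=1))  # ~0.667: two of three sources ranked correctly
```

Hits@k with larger k (and mean reciprocal rank) are the other standard EA metrics reported alongside Hits@1.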
Problem

Research questions and friction points this paper is trying to address.

Address structural and semantic heterogeneity in Knowledge Graphs
Improve entity alignment across diverse domains and languages
Overcome limitations of one-to-one entity matching assumptions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid model combining structural and semantic features
Attention-based factual model co-trained with structural model
Outperforms state-of-the-art in diverse datasets
N. Fanourakis
FORTH-ICS, Greece
Fatia Lekbour
ETIS, CYU University, France
Vasilis Efthymiou
Harokopio University of Athens
Knowledge Management, Big Data, Entity Resolution, Semantic Web, AI
Guillaume Renton
ETIS, ENSEA, France
V. Christophides
ETIS, ENSEA, France