🤖 AI Summary
To address the few-shot knowledge graph (KG) completion challenge posed by long-tailed relations, this paper proposes a generative modeling framework. First, it introduces a two-stage attention-based triplet augmenter that explicitly captures local neighborhood structure and semantic contrast signals. Second, it pioneers the integration of a U-shaped Kolmogorov–Arnold Network (U-KAN) into a denoising diffusion probabilistic model (DDPM), enabling joint modeling of topological structure and contrastive distribution. The method unifies graph neural networks with contrastive learning, eliminating alignment bias and distribution mismatch inherent in prior approaches. Evaluated on FB15k-237 and NELL-995 under the 5-shot setting, the method achieves a new state-of-the-art mean reciprocal rank (MRR), improving by 4.2% over the previous best, and significantly outperforms existing few-shot KG completion methods.
📝 Abstract
Knowledge Graphs (KGs), thanks to their concise and efficient triple-based structure, have been widely applied in intelligent question answering, recommender systems and other domains. However, the heterogeneous and multifaceted nature of real-world data inevitably renders the distribution of relations long-tailed, making it crucial to complete missing facts with limited samples. Previous studies mainly based on metric matching or meta learning, yet they either fail to fully exploit neighborhood information in graph or overlook the distributional characteristics of contrastive signals. In this paper, we re-examine the problem from a perspective of generative representation and propose a few-shot knowledge graph completion framework that integrates two-stage attention triple enhancer with U-KAN based diffusion model. Extensive experiments on two public datasets show that our method achieve new state-of-the-art results.