🤖 AI Summary
Problem: Knowledge graph completion (KGC) must achieve high-accuracy link prediction from local subgraph information alone, without relying on global graph structure or extensive labeled data.

Method: This paper proposes a generative subgraph modeling framework that introduces, for the first time, a subgraph-level prompt generation mechanism, integrating structure-aware encoding with semantic generation from large language models (LLMs) without fine-tuning. The approach supports both zero-shot and few-shot KGC.

Contributions/Results: Key innovations include structure-aware subgraph sampling and encoding, joint structure-text representation learning, and generative triple decoding. On standard benchmarks (FB15k-237, WN18RR), the method significantly outperforms conventional embedding-based models and LLM fine-tuning approaches, achieves a 3× inference speedup, and generalizes well to unseen relations and sparse entities.
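The subgraph-to-prompt pipeline described above can be illustrated with a minimal sketch: sample the k-hop neighborhood of a query entity, then serialize its triples into a completion prompt for an LLM. This is a hypothetical illustration, not the paper's actual implementation; the function names (`sample_subgraph`, `build_prompt`), the BFS sampler, and the prompt template are all assumptions.

```python
from collections import deque

def sample_subgraph(triples, center, hops=2):
    """BFS over an undirected view of the KG to collect the k-hop
    neighborhood of `center`, then return the triples it induces."""
    adj = {}
    for h, r, t in triples:
        adj.setdefault(h, set()).add(t)
        adj.setdefault(t, set()).add(h)
    seen = {center}
    frontier = deque([(center, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue
        for nb in adj.get(node, ()):
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, depth + 1))
    return [(h, r, t) for h, r, t in triples if h in seen and t in seen]

def build_prompt(subgraph_triples, head, relation):
    """Serialize the sampled subgraph as context for a zero-shot
    triple-completion query (hypothetical prompt template)."""
    context = "\n".join(f"({h}, {r}, {t})" for h, r, t in subgraph_triples)
    return (f"Known facts:\n{context}\n"
            f"Complete the triple: ({head}, {relation}, ?)")
```

In this sketch the LLM itself stays frozen: all task adaptation lives in the sampled context, which is what lets the approach run without fine-tuning and transfer to unseen relations.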