Geometric Structural Knowledge Graph Foundation Model

📅 2025-12-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing structural knowledge graph foundation models (e.g., Ultra) rely on a single relational transformation, such as element-wise multiplication, which limits expressiveness and generalization to unseen graph structures. Method: We propose a novel foundation model for zero-shot inductive link prediction, introducing the first multi-head geometric attention mechanism that concurrently models relation transformations in real, complex, split-complex, and dual number spaces. A lightweight entropy-regularized gating mechanism enables triplet-level adaptive fusion. Our approach integrates algebraic message passing, multi-geometric space embedding, and relation-conditioned attention. Contribution/Results: Evaluated across 56 heterogeneous knowledge graphs, our model achieves a 5.5% absolute improvement in MRR over Ultra on the inductive benchmarks and a 4.4% average gain across all benchmarks, empirically validating the complementarity and generalization benefits of multi-geometric representation.

📝 Abstract
Structural knowledge graph foundation models aim to generalize reasoning to completely new graphs with unseen entities and relations. A key limitation of existing approaches like Ultra is their reliance on a single relational transformation (e.g., element-wise multiplication) in message passing, which can constrain expressiveness and fail to capture the diverse relational and structural patterns exhibited across diverse graphs. In this paper, we propose Gamma, a novel foundation model that introduces multi-head geometric attention to knowledge graph reasoning. Gamma replaces the single relational transformation with multiple parallel ones, including real, complex, split-complex, and dual number based transformations, each designed to model different relational structures. A relation-conditioned attention fusion mechanism then adaptively fuses them at the link level via lightweight gating with entropy regularization, allowing the model to robustly emphasize the most appropriate relational bias for each triple pattern. We present a full formalization of these algebraic message functions and discuss how their combination increases expressiveness beyond any single space. Comprehensive experiments on 56 diverse knowledge graphs demonstrate that Gamma consistently outperforms Ultra in zero-shot inductive link prediction, with a 5.5% improvement in mean reciprocal rank on the inductive benchmarks and a 4.4% improvement across all benchmarks, highlighting the benefits of complementary geometric representations.
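The four parallel transformations the abstract names all share one template: an embedding is split into two halves (a, b) interpreted as a + b·u, and the relation is applied by element-wise multiplication in the corresponding algebra, where u² = -1 (complex), +1 (split-complex), or 0 (dual). A minimal sketch of this idea (illustrative only; the function name and layout are assumptions, not the paper's actual implementation):

```python
import numpy as np

def geometric_product(h, r, algebra):
    """Element-wise product of a message h with a relation embedding r,
    interpreted in a two-component hypercomplex algebra. Both vectors are
    split into halves (a, b) standing for a + b*u, with u^2 depending on
    the chosen algebra. Hypothetical sketch of the paper's four heads."""
    ha, hb = np.split(h, 2)
    ra, rb = np.split(r, 2)
    if algebra == "real":      # plain element-wise product (DistMult-style)
        return h * r
    if algebra == "complex":   # u^2 = -1
        return np.concatenate([ha * ra - hb * rb, ha * rb + hb * ra])
    if algebra == "split":     # u^2 = +1
        return np.concatenate([ha * ra + hb * rb, ha * rb + hb * ra])
    if algebra == "dual":      # u^2 = 0
        return np.concatenate([ha * ra, ha * rb + hb * ra])
    raise ValueError(f"unknown algebra: {algebra}")
```

For example, with h = [1, 2] and r = [3, 4], the complex head computes (1 + 2i)(3 + 4i) = -5 + 10i, while the dual head computes (1 + 2ε)(3 + 4ε) = 3 + 10ε; the differing u² is what gives each head a distinct relational bias.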
Problem

Research questions and friction points this paper is trying to address.

Generalize reasoning to unseen entities and relations
Overcome single relational transformation limitations in message passing
Enhance expressiveness with multi-head geometric attention for diverse graphs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-head geometric attention with diverse algebraic transformations
Relational conditioned attention fusion via lightweight gating
Complementary geometric representations enhance zero-shot inductive reasoning
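The gating idea in the points above can be sketched as a softmax over per-head messages plus an entropy term on the gate weights. This is an illustrative reconstruction, not the paper's implementation: the gate conditioning, the sign of the entropy term (encouraging vs. penalizing sharp gates), and the coefficient `beta` are all assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_fusion(head_messages, gate_logits, beta=0.01):
    """Fuse K per-head messages (shape (K, d)) with softmax gate weights
    derived from relation-conditioned logits (shape (K,)). Returns the
    fused message and an entropy regularizer to be added to the loss.
    Hypothetical sketch; beta and the regularizer's sign are assumptions."""
    w = softmax(gate_logits)                     # (K,) gate weights
    fused = (w[:, None] * head_messages).sum(0)  # (d,) weighted combination
    entropy = -(w * np.log(w + 1e-12)).sum()     # entropy of the gate
    return fused, beta * entropy
```

With equal logits the gate is uniform and the fused message is the plain mean of the heads; as one head's logit grows, the gate concentrates on it, and the entropy term lets training trade off sharpness against keeping all geometric heads in play.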