🤖 AI Summary
Existing knowledge graph embedding methods rely on predefined homogeneous manifolds—e.g., Euclidean, spherical, or hyperbolic spaces—rendering them inadequate for modeling heterogeneous geometric structures in real-world graphs, where local curvature varies sharply; this leads to distance distortion and limited representational capacity. To address this, we propose the first framework that dynamically couples embedding learning with local manifold curvature. Our approach introduces an extended Ricci flow mechanism, jointly optimizing geometric structure and entity representations via co-driven evolution: curvature evolves according to a differential equation while simultaneously adapting to gradients of the embedding loss. We theoretically prove exponential decay of edge curvature and convergence of embedding distances to the global optimum, enabling mutual reinforcement between geometric flattening and representation learning. Extensive experiments demonstrate significant improvements over state-of-the-art methods on link prediction and node classification, validating superior adaptability to heterogeneous graph topologies and enhanced geometric modeling capability.
📝 Abstract
Knowledge graph embedding (KGE) relies on the geometry of the embedding space to encode semantic and structural relations. Existing methods place all entities on one homogeneous manifold, Euclidean, spherical, hyperbolic, or their product/multi-curvature variants, to model linear, symmetric, or hierarchical patterns. Yet a predefined, homogeneous manifold cannot accommodate the sharply varying curvature that real-world graphs exhibit across local regions. Since this geometry is imposed a priori, any mismatch with the knowledge graph's local curvatures will distort distances between entities and hurt the expressiveness of the resulting KGE. To rectify this, we propose RicciKGE to have the KGE loss gradient coupled with local curvatures in an extended Ricci flow such that entity embeddings co-evolve dynamically with the underlying manifold geometry towards mutual adaptation. Theoretically, when the coupling coefficient is bounded and properly selected, we rigorously prove that i) all the edge-wise curvatures decay exponentially, meaning that the manifold is driven toward the Euclidean flatness; and ii) the KGE distances strictly converge to a global optimum, which indicates that geometric flattening and embedding optimization are promoting each other. Experimental improvements on link prediction and node classification benchmarks demonstrate RicciKGE's effectiveness in adapting to heterogeneous knowledge graph structures.