Hyperbolic Heterogeneous Graph Transformer

📅 2026-01-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing heterogeneous graph methods rely on tangent space operations, which often introduce mapping distortions and struggle to capture global hierarchical structures and long-range dependencies across node types. To address these limitations, this work proposes the first end-to-end fully hyperbolic heterogeneous graph Transformer, which performs message passing directly in hyperbolic space without resorting to tangent space mappings. The model incorporates a relation-aware hyperbolic attention mechanism with linear complexity, effectively integrating local and global dependencies while preserving the full heterogeneity of semantic relationships. Experimental results demonstrate that the proposed method significantly outperforms state-of-the-art models on node classification tasks, achieving higher accuracy alongside substantially reduced training time and memory consumption.

📝 Abstract
In heterogeneous graphs, we can observe complex structures such as tree-like or hierarchical structures. Recently, the hyperbolic space has been widely adopted in many studies to effectively learn these complex structures. Although these methods have demonstrated the advantages of the hyperbolic space in learning heterogeneous graphs, most existing methods still face several challenges. They rely heavily on tangent-space operations, which often lead to mapping distortions during frequent transitions. Moreover, their message-passing architectures mainly focus on local neighborhood information, making it difficult to capture global hierarchical structures and long-range dependencies between different types of nodes. To address these limitations, we propose the Hyperbolic Heterogeneous Graph Transformer (HypHGT), which effectively and efficiently learns heterogeneous graph representations entirely within the hyperbolic space. Unlike previous message-passing-based hyperbolic heterogeneous GNNs, HypHGT naturally captures both local and global dependencies through a transformer-based architecture. Furthermore, the proposed relation-specific hyperbolic attention mechanism in HypHGT, which operates with linear time complexity, enables efficient computation while preserving the heterogeneous information across different relation types. This design allows HypHGT to effectively capture the complex structural properties and semantic information inherent in heterogeneous graphs. We conduct comprehensive experiments to evaluate the effectiveness and efficiency of HypHGT, and the results demonstrate that it consistently outperforms state-of-the-art methods on the node classification task, with significantly reduced training time and memory usage.
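The two ingredients the abstract names, embedding nodes directly in hyperbolic space (avoiding repeated tangent-space round-trips) and attention with linear rather than quadratic complexity, can be illustrated with a minimal sketch. The paper's actual formulation is not reproduced here; the Lorentz-model exponential map and the kernelized attention below are standard constructions, and all function names and the feature map choice are illustrative assumptions.

```python
import numpy as np

def expmap0(v, c=1.0):
    # Exponential map at the origin of the Lorentz model with curvature -c:
    # lifts Euclidean tangent vectors v (n, d) onto the hyperboloid (n, d+1).
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(v, axis=-1, keepdims=True).clip(min=1e-9)
    time = np.cosh(sqrt_c * norm) / sqrt_c                 # time coordinate
    space = np.sinh(sqrt_c * norm) * v / (sqrt_c * norm)   # space coordinates
    return np.concatenate([time, space], axis=-1)

def linear_attention(q, k, v):
    # Kernelized attention with a positive feature map phi:
    # out = phi(Q) (phi(K)^T V) / (phi(Q) sum_j phi(K_j)),
    # computed in O(n * d * d_v) instead of the O(n^2) softmax form.
    phi = lambda x: np.maximum(x, 0.0) + 1e-6  # illustrative feature map
    kv = phi(k).T @ v                          # (d, d_v) summary, shared by all queries
    z = phi(q) @ phi(k).sum(axis=0)            # (n,) per-query normalizer
    return (phi(q) @ kv) / z[:, None]
```

In a relation-aware variant, one would keep a separate `kv` summary (and typically separate projections) per relation type, so each edge semantics contributes its own attention channel while the overall cost stays linear in the number of nodes.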
Problem

Research questions and friction points this paper is trying to address.

heterogeneous graph
hyperbolic space
message passing
hierarchical structure
long-range dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hyperbolic space
Heterogeneous graph
Graph Transformer
Relation-specific attention
Global hierarchical structure