Riemannian Geometry Speaks Louder Than Words: From Graph Foundation Model to Next-Generation Graph Intelligence

📅 2026-03-23
🤖 AI Summary
This work addresses the limitations of existing graph neural networks in multi-domain pretraining, particularly their weak memory retention, limited interpretability, and the difficulty of serializing graphs for integration with large language models. To overcome these challenges, the paper proposes the Riemannian Foundation Model (RFM), which, for the first time, adopts Riemannian geometry as the core framework for graph foundation models. By explicitly modeling the intrinsic geometric structure of graphs, RFM enables structure-aware reasoning and generation. The approach integrates Riemannian geometry, graph semantic learning, and large language model architectures, moving beyond the conventional paradigm of representation-space transformation. This enables cross-domain generalization and unified graph understanding, advancing the shift from specialized graph models toward general-purpose graph intelligence agents.

📝 Abstract
Graphs provide a natural description of the complex relationships among objects and play a pivotal role in communications, transportation, social computing, the life sciences, and beyond. There is currently strong agreement that Graph Foundation Models (GFMs) are essential for advancing graph learning, yet considerable disagreement persists on how to build a powerful, general-purpose GFM analogous to Large Language Models (LLMs). Graph Neural Networks (GNNs) exhibit limitations in memory retention and principled interpretability when confronted with multi-domain pretraining and adaptation. The challenge of graph serialization hinders the direct application of LLMs, as words struggle to capture the structural complexity and diversity inherent in graphs. In contrast, Riemannian geometry offers an elegant mathematical framework for modeling structures, while remaining compatible with graph semantic learning and even with LLMs. In this paper, we argue that, for graphs, Riemannian geometry speaks louder than words, and we lay out the foundational principles for GFMs. Reimagining graph learning with Riemannian geometry, we introduce a blue-sky idea, the Riemannian Foundation Model (RFM), which opens a new pathway for capturing complex structural patterns and uncovering cross-domain generalities. RFM emphasizes intrinsic graph geometry and embodies endogenous capacities for structural inference and generation, moving beyond mere representation-space switching. Accordingly, we outline a progressive agenda that begins with universal structural understanding through intrinsic geometry and then rebuilds the LLM with a Riemannian engine for general-purpose graph modeling and beyond. RFM thus enables a paradigm shift from designing graph models to solving graph-structured applications with RFM agents, unlocking the next generation of graph intelligence.
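The abstract's claim that Riemannian geometry naturally models graph structure is commonly illustrated with hyperbolic space, whose exponential volume growth matches the branching of tree-like graphs. The paper does not prescribe a specific manifold or implementation; as a minimal sketch of the idea, the snippet below computes geodesic distance in the Poincaré ball model, the standard closed-form example (the function name and test points are illustrative, not from the paper):

```python
import math

def poincare_distance(u, v):
    """Geodesic distance in the Poincaré ball model of hyperbolic space.

    d(u, v) = arccosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    Points must lie strictly inside the unit ball (||x|| < 1).
    """
    sq = lambda x: sum(xi * xi for xi in x)          # squared Euclidean norm
    diff = [ui - vi for ui, vi in zip(u, v)]
    num = 2.0 * sq(diff)
    denom = (1.0 - sq(u)) * (1.0 - sq(v))
    return math.acosh(1.0 + num / denom)

# Distances blow up near the boundary of the ball, mirroring the
# exponential volume growth that lets hyperbolic space embed trees
# with low distortion, unlike flat Euclidean space.
origin = [0.0, 0.0]
near = [0.5, 0.0]    # Euclidean distance 0.5 -> hyperbolic ~1.10
far = [0.9, 0.0]     # Euclidean distance 0.9 -> hyperbolic ~2.94
print(poincare_distance(origin, near))
print(poincare_distance(origin, far))
```

This is one instance of the broader point: choosing a manifold of appropriate curvature (hyperbolic for hierarchies, spherical for cycles, products for mixtures) lets the embedding space itself encode structural priors that serialized token sequences cannot.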
Problem

Research questions and friction points this paper is trying to address.

Graph Foundation Model
Riemannian geometry
graph serialization
structural complexity
general-purpose graph modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Riemannian Geometry
Graph Foundation Model
Structural Inference
Intrinsic Geometry
Graph Intelligence