🤖 AI Summary
Graph data often exhibit heterogeneous geometric structures, such as hierarchical tree-like patterns and dense community clusters, which pose challenges for existing geometric graph neural networks (GNNs): these models typically rely on a single fixed-curvature space or on piecewise products of fixed-curvature spaces, and thus fail to capture local geometric diversity. To address this, we propose an adaptive Riemannian manifold learning framework that learns a continuous, anisotropic, differentiable metric tensor for each node, enabling autonomous adaptation to the locally optimal curvature. We further introduce a Ricci-flow-inspired curvature regularization that guarantees convergence of the geometric evolution, and we provide a unified theoretical foundation covering both constant- and mixed-curvature GNNs. Our method employs covariant derivatives to achieve stable message passing and optimization on non-uniform manifolds. Extensive experiments demonstrate state-of-the-art performance on both homophilic and heterophilic graph benchmarks, while the learned geometric parameters remain interpretable, validating the theoretical soundness and practical efficacy of our approach.
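To make the Ricci-flow-inspired regularization concrete, here is a minimal sketch of one plausible discrete analog: since Ricci flow evolves a metric toward a geometrically smoother configuration, a simple graph-level surrogate penalizes differences between the diagonal metric entries of adjacent nodes. The function name and the specific penalty are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def ricci_smoothness_penalty(G, edges):
    """Hypothetical discrete Ricci-flow-style regularizer.

    G:     (n, d) array; G[i] holds the diagonal entries of node i's
           learned metric tensor.
    edges: iterable of (i, j) index pairs (graph edges).

    Penalizes squared differences between the metrics of adjacent
    nodes, encouraging the learned metric field to vary smoothly
    over the graph, a crude analog of Ricci flow's smoothing effect.
    """
    total = 0.0
    for i, j in edges:
        diff = G[i] - G[j]
        total += float(diff @ diff)
    # Average over edges so the penalty scale is graph-size invariant.
    return total / max(len(edges), 1)
```

In practice such a term would be added to the task loss with a tunable weight, trading off geometric regularity against task fit.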
📝 Abstract
Graph data often exhibits complex geometric heterogeneity, where structures with varying local curvature, such as tree-like hierarchies and dense communities, coexist within a single network. Existing geometric GNNs, which embed graphs into single fixed-curvature manifolds or discrete product spaces, struggle to capture this diversity. We introduce Adaptive Riemannian Graph Neural Networks (ARGNN), a novel framework that learns a continuous and anisotropic Riemannian metric tensor field over the graph. This allows each node to determine its optimal local geometry, enabling the model to fluidly adapt to the graph's structural landscape. Our core innovation is an efficient parameterization of the node-wise metric tensor, specialized to a learnable diagonal form that captures directional geometric information while maintaining computational tractability. To ensure geometric regularity and stable training, we integrate a Ricci flow-inspired regularization that smooths the learned manifold. Theoretically, we establish a rigorous convergence guarantee for ARGNN's geometric evolution and provide a continuous generalization that unifies prior fixed- and mixed-curvature GNNs. Empirically, our method achieves superior performance on both homophilic and heterophilic benchmark datasets, adaptively capturing their diverse structures. Moreover, the learned geometries offer interpretable insights into the underlying graph structure and empirically corroborate our theoretical analysis.
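To illustrate what a learnable node-wise diagonal metric could look like in message passing, the sketch below weights each neighbor by an anisotropic Gaussian kernel under the receiving node's diagonal metric. This is a simplified assumption-laden reading of the abstract (the function name, kernel choice, and log-parameterization are all illustrative), not ARGNN's actual layer.

```python
import numpy as np

def metric_weighted_messages(X, log_g, adj):
    """Sketch of anisotropic, node-wise metric aggregation.

    X:     (n, d) node features.
    log_g: (n, d) learnable log-diagonal metric entries, so
           g_i = exp(log_g[i]) is positive by construction,
           keeping each node's metric tensor positive definite.
    adj:   (n, n) 0/1 adjacency matrix.
    """
    g = np.exp(log_g)  # (n, d) diagonal metric per node
    out = np.zeros_like(X)
    for i in range(X.shape[0]):
        nbrs = np.nonzero(adj[i])[0]
        if nbrs.size == 0:
            out[i] = X[i]  # isolated node keeps its features
            continue
        # Squared distance under node i's metric: ||x_i - x_j||^2_{g_i}.
        # Larger metric entries make the corresponding feature
        # directions "longer", so differences there are penalized more.
        d2 = ((X[i] - X[nbrs]) ** 2 * g[i]).sum(axis=1)
        w = np.exp(-d2)
        w = w / w.sum()          # normalized attention-like weights
        out[i] = w @ X[nbrs]     # metric-weighted neighbor average
    return out
```

With `log_g` fixed at zero this reduces to an isotropic Gaussian-weighted aggregation; learning `log_g` per node is what lets each node emphasize different feature directions, the "anisotropic" aspect described above.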