🤖 AI Summary
Existing graph distillation methods struggle to capture the geometric properties (e.g., hierarchical/tree-like structure) and dynamic information flow of real-world networks, resulting in poor task performance on simplified graphs and weak continual learning capability. To address this, we propose the first graph distillation framework integrating hyperbolic embedding with optimized random walks: hyperbolic space inherently models tree-like geometry, while spectral gap regularization jointly preserves geometric fidelity and dynamical characteristics. We further introduce distribution alignment and a graph continual learning adaptation mechanism to enhance robustness and improve the privacy–utility trade-off. Our method achieves state-of-the-art performance across node classification, link prediction, and continual graph learning benchmarks—demonstrating superior accuracy, strong noise resilience, and consistent gains over prior art.
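Hyperbolic space is a natural fit for tree-like hierarchies because distances grow exponentially toward the boundary of the Poincaré ball, mirroring the exponential branching of trees. As a minimal illustration (not the paper's implementation), the closed-form geodesic distance on the Poincaré ball can be sketched as:

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Geodesic distance between two points inside the Poincare ball (||x|| < 1).

    d(u, v) = arccosh(1 + 2||u-v||^2 / ((1-||u||^2)(1-||v||^2)))
    """
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_dist / denom))

# Two points near the boundary: their hyperbolic distance far exceeds
# the Euclidean one, which is what lets a low-dimensional hyperbolic
# embedding separate many tree branches.
u = np.array([0.9, 0.0])
v = np.array([0.0, 0.9])
print(poincare_distance(u, v))          # much larger than ||u - v|| ~ 1.27
```

Embedding a hierarchy typically places the root near the origin and leaves near the boundary, so sibling subtrees stay far apart despite the low embedding dimension.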
📝 Abstract
Graph distillation (GD) is an effective approach to extract useful information from large-scale network structures. However, existing methods, which generate condensed graphs in Euclidean space, struggle to capture the inherent tree-like geometry of real-world networks, yielding distilled graphs with limited task-specific information for downstream tasks. Furthermore, these methods often fail to preserve dynamic graph properties, which are crucial for understanding information flow and for graph continual learning. This paper presents Hyperbolic Graph Distillation with Random Walks Optimization (HyDRO), a novel graph distillation approach that leverages hyperbolic embeddings to capture complex geometric patterns and optimizes the spectral gap in hyperbolic space. Experiments show that HyDRO generalizes well across tasks, consistently outperforming state-of-the-art methods in both node classification and link prediction. HyDRO also effectively preserves graph random walk properties, producing condensed graphs that achieve enhanced performance in continual graph learning. Additionally, HyDRO achieves competitive results on mainstream graph distillation benchmarks, while maintaining a strong balance between privacy and utility and exhibiting robust resistance to noise.
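The spectral gap mentioned above is the standard quantity linking graph structure to random-walk dynamics: for the transition matrix P = D⁻¹A, the gap 1 − λ₂ (with λ₂ the second-largest eigenvalue) controls how fast a random walk mixes, so preserving it in the condensed graph preserves information-flow behavior. A minimal sketch of this quantity (illustrative only, not HyDRO's objective) is:

```python
import numpy as np

def spectral_gap(adj: np.ndarray) -> float:
    """Spectral gap 1 - lambda_2 of the random-walk matrix P = D^{-1} A.

    A larger gap means random walks converge faster to the stationary
    distribution, i.e. the graph is better connected / better mixing.
    """
    deg = adj.sum(axis=1)
    P = adj / deg[:, None]                      # row-stochastic transition matrix
    eigs = np.sort(np.linalg.eigvals(P).real)[::-1]
    return float(1.0 - eigs[1])

def path_adj(n: int) -> np.ndarray:
    """Adjacency matrix of the path graph on n nodes (a poorly mixing graph)."""
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    return A

# Complete graph K5 (well connected) vs path graph on 5 nodes:
K5 = np.ones((5, 5)) - np.eye(5)
print(spectral_gap(K5), spectral_gap(path_adj(5)))  # K5 has the larger gap
```

A distillation objective that regularizes this gap, as the abstract describes, would push the condensed graph's mixing behavior toward that of the original.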