🤖 AI Summary
This work addresses the asymptotic behavior of sparse random graphs under message-passing graph neural networks (MPNNs). To characterize local convergence, we introduce *color convergence*, a novel criterion grounded in the Weisfeiler–Leman (WL) coloring algorithm. Building on this, we propose the Refined Configuration Model (RCM), a unified framework capturing the local limit structures of diverse sparse random graph families, including Erdős–Rényi, classical configuration, and random regular graphs. Theoretically, we establish a rigorous analytical framework integrating local convergence, WL coloring, and stochastic graph modeling. Structurally, we fully characterize the random trees arising as local limits of sparse graphs and prove that the RCM serves as a universal representation for tree-like local limits. Collectively, our results provide a foundational theory for analyzing the expressive power and asymptotic behavior of MPNNs on sparse graphs.
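Since color convergence is grounded in the WL coloring algorithm, a brief sketch of 1-WL (color refinement) may help fix ideas. The following is an illustrative implementation, not taken from the paper; the function name and adjacency-dict representation are our own choices.

```python
def wl_colors(adj, rounds=3):
    """1-WL color refinement: repeatedly refine each node's color using
    the multiset of its neighbors' colors.

    adj: dict mapping each node to a list of its neighbors.
    Returns a dict mapping each node to an integer color.
    """
    colors = {v: 0 for v in adj}  # start from a uniform coloring
    for _ in range(rounds):
        # signature = (own color, sorted multiset of neighbor colors)
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # relabel distinct signatures with fresh integer colors
        palette = {}
        for v in adj:
            palette.setdefault(sigs[v], len(palette))
        colors = {v: palette[sigs[v]] for v in adj}
    return colors

# A path on 3 nodes: the two endpoints receive one color,
# the center node another.
path = {0: [1], 1: [0, 2], 2: [1]}
coloring = wl_colors(path)
```

MPNN layers aggregate neighbor information in exactly this fashion, which is why WL colorings bound the expressive power of message passing.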
📝 Abstract
Local convergence has emerged as a fundamental tool for analyzing sparse random graph models. We introduce a new notion of local convergence, color convergence, based on the Weisfeiler–Leman algorithm. Color convergence fully characterizes the class of random graphs that are well-behaved in the limit for message-passing graph neural networks. Building on this, we propose the Refined Configuration Model (RCM), a random graph model that generalizes the configuration model. The RCM is universal with respect to local convergence among locally tree-like random graph models, including the Erdős–Rényi, stochastic block, and configuration models. Finally, this framework enables a complete characterization of the random trees that arise as local limits of such graphs.