On the Convergence and Size Transferability of Continuous-depth Graph Neural Networks

📅 2025-10-04
📈 Citations: 0 · Influential citations: 0
🤖 AI Summary
This work studies the convergence and size transferability of continuous-depth Graph Neural Differential Equations (GNDEs) as the number of nodes tends to infinity. To close this gap in the theory, the authors introduce the Graphon Neural Differential Equation (Graphon-NDE) as the infinite-node limit of GNDEs, establish its well-posedness, and prove trajectory-wise convergence of GNDE solutions to their Graphon-NDE counterparts. Leveraging tools from graph theory, graphon analysis, and dynamical systems, they derive explicit convergence rates and size-transfer error bounds under two deterministic sampling schemes. Empirical results confirm that GNDEs trained on moderate-sized graphs maintain performance on larger, structurally similar graphs without retraining. The core contribution is the first rigorous convergence theory for continuous-depth graph learning with provable guarantees and quantifiable size-transfer errors.
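To make the object of study concrete, here is a minimal sketch of a GNDE with time-varying parameters, dX/dt = tanh(Â X W(t)), integrated by forward Euler. This is an illustrative toy, not the authors' implementation: the row normalization, the tanh nonlinearity, the piecewise-constant parameterization of W(t), and all function names are assumptions.

```python
import numpy as np

def gnde_trajectory(A, X0, weights, T=1.0, steps=100):
    """Integrate a toy GNDE  dX/dt = tanh(A_hat @ X @ W(t))  by forward Euler.

    A       : (n, n) adjacency matrix (weighted or 0/1)
    X0      : (n, d) initial node features
    weights : list of (d, d) parameter matrices; W(t) is taken
              piecewise-constant over equal subintervals of [0, T]
    """
    deg = np.maximum(A.sum(axis=1), 1e-12)
    A_hat = A / deg[:, None]          # row-normalized graph operator (one common choice)
    X = X0.copy()
    dt = T / steps
    for k in range(steps):
        W_t = weights[int(k * len(weights) / steps)]   # piecewise-constant W(t)
        X = X + dt * np.tanh(A_hat @ X @ W_t)
    return X
```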

📝 Abstract
Continuous-depth graph neural networks, also known as Graph Neural Differential Equations (GNDEs), combine the structural inductive bias of Graph Neural Networks (GNNs) with the continuous-depth architecture of Neural ODEs, offering a scalable and principled framework for modeling dynamics on graphs. In this paper, we present a rigorous convergence analysis of GNDEs with time-varying parameters in the infinite-node limit, providing theoretical insights into their size transferability. To this end, we introduce Graphon Neural Differential Equations (Graphon-NDEs) as the infinite-node limit of GNDEs and establish their well-posedness. Leveraging tools from graphon theory and dynamical systems, we prove the trajectory-wise convergence of GNDE solutions to Graphon-NDE solutions. Moreover, we derive explicit convergence rates under two deterministic graph sampling regimes: (1) weighted graphs sampled from smooth graphons, and (2) unweighted graphs sampled from $\{0,1\}$-valued (discontinuous) graphons. We further establish size transferability bounds, providing theoretical justification for the practical strategy of transferring GNDE models trained on moderate-sized graphs to larger, structurally similar graphs without retraining. Numerical experiments using synthetic and real data support our theoretical findings.
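The two deterministic sampling regimes in the abstract can be illustrated concretely. In the sketch below, a smooth graphon (here W(u, v) = exp(-|u - v|)) is evaluated on the uniform grid u_i = i/n to produce a weighted graph, and a {0,1}-valued graphon (here the half-graphon 1{u + v <= 1}) produces an unweighted graph the same way. The specific graphons and the grid choice are illustrative assumptions, not the paper's exact constructions.

```python
import numpy as np

def sample_weighted(graphon, n):
    """Weighted graph from a smooth graphon: A[i, j] = W(u_i, u_j) on a uniform grid."""
    u = np.arange(1, n + 1) / n                    # deterministic grid points u_i = i/n
    return graphon(u[:, None], u[None, :])

def sample_unweighted(graphon01, n):
    """Unweighted graph from a {0,1}-valued graphon: A[i, j] = W(u_i, u_j) in {0, 1}."""
    u = np.arange(1, n + 1) / n
    return graphon01(u[:, None], u[None, :]).astype(float)

smooth = lambda x, y: np.exp(-np.abs(x - y))       # smooth graphon (example)
step01 = lambda x, y: (x + y <= 1.0)               # {0,1}-valued graphon (example)

A_weighted = sample_weighted(smooth, 200)
A_unweighted = sample_unweighted(step01, 200)
```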
Problem

Research questions and friction points this paper is trying to address.

Analyzing the convergence of continuous-depth graph neural networks in the infinite-node limit
Establishing size-transferability bounds for graph neural differential equations
Proving explicit convergence rates under two deterministic graph sampling regimes (a numerical sanity check follows this list)
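One way to probe the convergence question empirically, reusing gnde_trajectory, sample_weighted, and smooth from the sketches above: integrate the same GNDE, with identical fixed parameters, on graphs sampled at increasing n from one graphon, and compare each trajectory with a fine-resolution reference restricted to the shared grid points. A sanity-check sketch, not the paper's experiments; the feature map f and all sizes are arbitrary choices.

```python
import numpy as np
rng = np.random.default_rng(0)

d = 4
weights = [rng.standard_normal((d, d)) * 0.5 for _ in range(4)]   # fixed W(t), shared across sizes
f = lambda u: np.stack([np.sin(2 * np.pi * k * u) for k in range(1, d + 1)], axis=1)

n_ref = 800                                        # fine-resolution reference graph
u_ref = np.arange(1, n_ref + 1) / n_ref
X_ref = gnde_trajectory(sample_weighted(smooth, n_ref), f(u_ref), weights)

for n in (50, 100, 200, 400):                      # each n divides n_ref, so grids are nested
    u = np.arange(1, n + 1) / n
    X_n = gnde_trajectory(sample_weighted(smooth, n), f(u), weights)
    idx = np.arange(1, n + 1) * (n_ref // n) - 1   # reference indices at the shared grid points
    err = np.abs(X_n - X_ref[idx]).max()
    print(f"n={n:4d}  sup-error vs n_ref={n_ref}: {err:.4f}")
```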
Innovation

Methods, ideas, or system contributions that make the work stand out.

Continuous-depth GNNs that combine the inductive bias of GNNs with Neural ODEs
Graphon-NDEs introduced as the infinite-node limit enabling the analysis
Size-transferability bounds that justify scaling to larger graphs without retraining (sketched after this list)
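A matching sketch of the size-transfer strategy itself, again reusing the helpers defined above: parameters are fixed once at a moderate graph size (random weights stand in for trained ones, an illustrative assumption) and the identical model is then applied to larger graphs sampled from the same graphon, without retraining, tracking the drift of a permutation-invariant readout.

```python
import numpy as np
rng = np.random.default_rng(1)

d = 4
weights = [rng.standard_normal((d, d)) * 0.5 for _ in range(4)]   # stand-in for trained W(t)
readout = lambda X: X.mean(axis=0)                 # permutation-invariant graph-level readout

def run(n):
    u = np.arange(1, n + 1) / n
    X0 = np.stack([np.cos(2 * np.pi * k * u) for k in range(1, d + 1)], axis=1)
    return readout(gnde_trajectory(sample_weighted(smooth, n), X0, weights))

y_small = run(100)                                 # graph size used "at training time"
for n in (200, 500, 1000, 2000):
    y_large = run(n)                               # same model, larger graph, no retraining
    print(f"n={n:5d}  readout drift: {np.linalg.norm(y_large - y_small):.4f}")
```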
👥 Authors
Mingsong Yan (Department of Mathematics, University of California, Santa Barbara, CA)
Charles Kulick (Department of Mathematics, University of California, Santa Barbara, CA)
Sui Tang (University of California, Santa Barbara)
Mathematics of Data Science · Applied and Computational Harmonic Analysis · Signal Processing