🤖 AI Summary
To address the pervasive over-smoothing and vanishing-gradient problems in deep graph neural networks (GNNs), this paper proposes the complex-valued Stuart–Landau Graph Neural Network (SLGNN). SLGNN is the first GNN to incorporate Stuart–Landau oscillator dynamics, modeling the joint evolution of the amplitude and phase of node features through complex-valued states; its stable yet tunable nonlinear propagation arises from limit-cycle behavior near the Hopf bifurcation. Unlike Kuramoto-based models, which are restricted to phase-only dynamics, SLGNN offers greater expressivity and can jointly modulate feature amplitudes and graph structure. The model uses an Euler discretization of the oscillator dynamics together with message passing to build a multi-layer architecture. Extensive experiments on node classification, graph classification, and graph regression tasks demonstrate that SLGNN significantly outperforms existing oscillatory GNNs. This work establishes a new paradigm for deep graph learning that combines theoretical grounding in nonlinear dynamical systems with empirical effectiveness.
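To make the "Euler discretization plus message passing" idea concrete, here is a minimal sketch of one Euler step of diffusively coupled Stuart–Landau dynamics on a graph. The update rule, the parameter names (`lam`, `omega`, `K`, `dt`), and the use of the plain adjacency matrix as the coupling operator are illustrative assumptions, not the paper's exact layer definition (which additionally involves learned weights):

```python
import numpy as np

def stuart_landau_layer(z, A, lam=0.1, omega=1.0, K=0.5, dt=0.01):
    """One Euler step of coupled Stuart-Landau dynamics on a graph (sketch).

    z     : complex node states, shape (n, d)
    A     : symmetric adjacency matrix, shape (n, n)
    lam   : Hopf parameter (lam > 0 gives a stable limit cycle of radius sqrt(lam))
    omega : intrinsic oscillation frequency
    K     : coupling strength
    dt    : Euler step size
    """
    # Intrinsic Stuart-Landau term: (lam + i*omega) z - |z|^2 z
    intrinsic = (lam + 1j * omega) * z - (np.abs(z) ** 2) * z
    # Diffusive message-passing coupling: K * sum_k A_jk (z_k - z_j)
    deg = A.sum(axis=1, keepdims=True)
    coupling = K * (A @ z - deg * z)
    return z + dt * (intrinsic + coupling)

# Example: a 4-node ring with scalar complex node features
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
z = rng.standard_normal((4, 1)) + 1j * rng.standard_normal((4, 1))
for _ in range(2000):
    z = stuart_landau_layer(z, A)
# Amplitudes settle near the limit-cycle radius sqrt(lam),
# while phases keep rotating at frequency omega
print(np.abs(z).ravel())
```

Stacking many such steps (with `dt` and the number of steps as depth-related hyperparameters) yields a deep architecture whose forward dynamics are attracted to a limit cycle rather than a fixed point, which is the mechanism the summary credits for avoiding over-smoothing.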
📝 Abstract
Oscillatory Graph Neural Networks (OGNNs) are an emerging class of physics-inspired architectures designed to mitigate the oversmoothing and vanishing-gradient problems in deep GNNs. In this work, we introduce the complex-valued Stuart–Landau Graph Neural Network (SLGNN), a novel architecture grounded in Stuart–Landau oscillator dynamics. Stuart–Landau oscillators are canonical models of limit-cycle behavior near a Hopf bifurcation; they are fundamental to synchronization theory and are widely used, for example, in neuroscience for mesoscopic brain modeling. Unlike harmonic oscillators and phase-only Kuramoto models, Stuart–Landau oscillators retain both amplitude and phase dynamics, enabling rich phenomena such as amplitude regulation and multistable synchronization. The proposed SLGNN generalizes existing phase-centric Kuramoto-based OGNNs by letting node-feature amplitudes evolve dynamically according to Stuart–Landau dynamics, with explicit tunable hyperparameters (such as the Hopf parameter and the coupling strength) providing additional control over the interplay between feature amplitudes and network structure. We conduct extensive experiments across node classification, graph classification, and graph regression tasks, demonstrating that SLGNN outperforms existing OGNNs and establishes a novel, expressive, and theoretically grounded framework for deep oscillatory architectures on graphs.
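For reference, the standard diffusively coupled Stuart–Landau system (a common textbook form; the paper's exact parameterization may differ) reads

$$
\dot z_j = (\lambda + i\omega)\,z_j - |z_j|^2 z_j + K \sum_{k} A_{jk}\,(z_k - z_j),
$$

where $z_j \in \mathbb{C}$ is the state of node $j$, $\lambda$ is the Hopf parameter, $\omega$ the intrinsic frequency, and $K$ the coupling strength. Writing $z_j = r_j e^{i\theta_j}$, the uncoupled dynamics decompose as

$$
\dot r_j = \lambda r_j - r_j^3, \qquad \dot\theta_j = \omega,
$$

which makes the contrast with Kuramoto-based OGNNs explicit: Kuramoto models evolve only $\theta_j$, whereas Stuart–Landau dynamics also regulate the amplitude $r_j$, which for $\lambda > 0$ is attracted to the limit-cycle radius $\sqrt{\lambda}$.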