🤖 AI Summary
This work addresses the over-smoothing problem in deep graph neural networks (GNNs), where node representations homogenize with depth and become indistinguishable, losing discriminative power. Bifurcation theory is introduced into GNN analysis for the first time, and a non-monotonic activation function is proposed to replace the conventional ReLU. This substitution induces bifurcations that destabilize homogeneous fixed points, leading to heterogeneous stable patterns that resist over-smoothing. Leveraging Lyapunov–Schmidt reduction and nonlinear dynamical systems theory, the authors rigorously derive a scaling law for pattern amplitudes and a closed-form initialization strategy. Extensive experiments on real-world benchmark datasets quantitatively validate the theoretical predictions and demonstrate significant performance improvements in deep GNNs.
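The summary's key mechanism is swapping ReLU for a non-monotonic activation. The paper's specific activation class is not reproduced here, so the sketch below uses a hypothetical non-monotone choice, f(x) = x·exp(−x²/2), purely to illustrate what "non-monotone" means in contrast to ReLU:

```python
import math

def relu(x):
    # Standard monotone activation: output never decreases as x grows.
    return max(0.0, x)

def nonmono(x):
    # Hypothetical non-monotone activation (NOT the paper's exact choice):
    # rises to a peak at x = 1, then decays back toward zero.
    return x * math.exp(-x * x / 2.0)

# ReLU is monotone: a larger input never gives a smaller output.
print(relu(1.0) <= relu(2.0))        # True

# The non-monotone function turns around: f(1) > f(2).
print(nonmono(1.0) > nonmono(2.0))   # True
```

It is this turning-around of the response curve that, per the summary, supplies the destabilizing feedback a monotone activation like ReLU cannot.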
📝 Abstract
Graph Neural Networks (GNNs) learn node representations through iterative message passing over the graph. While powerful, deep GNNs suffer from oversmoothing, where node features converge to a homogeneous, non-informative state. We re-frame this representational collapse from a \emph{bifurcation theory} perspective, characterizing oversmoothing as convergence to a stable ``homogeneous fixed point.'' Our central contribution is the theoretical discovery that this undesired stability can be broken by replacing standard monotone activations (e.g., ReLU) with a class of non-monotone functions. Using Lyapunov-Schmidt reduction, we analytically prove that this substitution induces a bifurcation that destabilizes the homogeneous state and creates a new pair of stable, non-homogeneous \emph{patterns} that provably resist oversmoothing. Our theory predicts a precise, nontrivial scaling law for the amplitude of these emergent patterns, which we quantitatively validate in experiments. Finally, we demonstrate the practical utility of our theory by deriving a closed-form, bifurcation-aware initialization and showing its effectiveness on real benchmark datasets.
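The square-root amplitude scaling that pitchfork bifurcations generically produce can be checked numerically. The toy map below, x ← μ·tanh(x), is a standard pitchfork example chosen here only for illustration (an assumption, not the paper's actual reduced equation): below the critical value μ = 1 the homogeneous state x = 0 is stable, and above it two nonzero fixed points appear with leading-order amplitude ≈ √(3(μ − 1)).

```python
import math

def iterate(mu, x0=0.1, steps=200):
    # Iterate the scalar map x <- mu * tanh(x) until it settles on its attractor.
    x = x0
    for _ in range(steps):
        x = mu * math.tanh(x)
    return x

sub = iterate(0.8)                 # mu < 1: collapses to the homogeneous state 0
sup = iterate(1.2)                 # mu > 1: settles on a nonzero "pattern"
pred = math.sqrt(3 * (1.2 - 1.0))  # leading-order amplitude, sqrt(3*(mu - 1))

print(abs(sub) < 1e-3)             # True: sub-critical amplitude vanishes
print(abs(sup - pred) < 0.05)      # True: amplitude matches the sqrt scaling
```

This is the same qualitative picture the abstract describes: a control parameter crosses a threshold, the homogeneous fixed point loses stability, and a pair of patterned states with square-root amplitude scaling takes over.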