Beyond ReLU: Bifurcation, Oversmoothing, and Topological Priors

📅 2026-02-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the over-smoothing problem in deep graph neural networks (GNNs), where repeated message passing homogenizes node representations until they become indistinguishable and lose discriminative power. The authors introduce bifurcation theory into GNN analysis for the first time and propose a non-monotonic activation function to replace the conventional ReLU. This substitution induces a bifurcation that destabilizes the homogeneous fixed point, producing heterogeneous stable patterns that resist over-smoothing. Using Lyapunov–Schmidt reduction and tools from nonlinear dynamical systems, the authors derive a scaling law for the amplitude of the emergent patterns and a closed-form, bifurcation-aware initialization strategy. Experiments on real-world benchmark datasets quantitatively validate the theoretical predictions and show significant performance gains for deep GNNs.
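The oversmoothing the summary describes can be observed directly in a toy setting: repeated normalized propagation followed by ReLU drives the Dirichlet energy (a standard measure of feature heterogeneity) toward zero. The sketch below is a minimal NumPy illustration, not the paper's model; the bump-shaped activation `x * exp(-x**2)` is a hypothetical stand-in for the paper's unspecified non-monotone activation.

```python
import numpy as np

def dirichlet_energy(X, A):
    # Tr(X^T L X): total feature variation across edges; -> 0 under oversmoothing
    L = np.diag(A.sum(axis=1)) - A
    return float(np.trace(X.T @ L @ X))

# Toy graph: a ring of 6 nodes, with self-loops and symmetric normalization
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
A_hat = A + np.eye(n)
d = A_hat.sum(axis=1)
P = A_hat / np.sqrt(np.outer(d, d))   # D^{-1/2} (A + I) D^{-1/2}

relu = lambda x: np.maximum(x, 0.0)
bump = lambda x: x * np.exp(-x**2)    # hypothetical non-monotone activation

rng = np.random.default_rng(0)
X0 = rng.normal(size=(n, 4))
X_relu, X_bump = X0.copy(), X0.copy()
for _ in range(30):                   # 30 weightless propagation layers
    X_relu = relu(P @ X_relu)
    X_bump = bump(P @ X_bump)

print(f"initial energy:       {dirichlet_energy(X0, A):.4f}")
print(f"after 30 ReLU layers: {dirichlet_energy(X_relu, A):.2e}")  # collapses toward 0
print(f"after 30 bump layers: {dirichlet_energy(X_bump, A):.2e}")
```

In this weightless caricature the ReLU trajectory homogenizes rapidly; whether a non-monotone activation sustains heterogeneous patterns depends on the layer gain and the graph spectrum, which is precisely what the paper's bifurcation analysis is said to characterize.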

📝 Abstract
Graph Neural Networks (GNNs) learn node representations through iterative message passing over the graph. While powerful, deep GNNs suffer from oversmoothing, where node features converge to a homogeneous, non-informative state. We re-frame this representational collapse from a \emph{bifurcation theory} perspective, characterizing oversmoothing as convergence to a stable ``homogeneous fixed point.'' Our central contribution is the theoretical discovery that this undesired stability can be broken by replacing standard monotone activations (e.g., ReLU) with a class of non-monotone functions. Using Lyapunov-Schmidt reduction, we analytically prove that this substitution induces a bifurcation that destabilizes the homogeneous state and creates a new pair of stable, non-homogeneous \emph{patterns} that provably resist oversmoothing. Our theory predicts a precise, nontrivial scaling law for the amplitude of these emergent patterns, which we validate quantitatively in experiments. Finally, we demonstrate the practical utility of our theory by deriving a closed-form, bifurcation-aware initialization and showing its effectiveness in real benchmark experiments.
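For readers unfamiliar with the kind of scaling law the abstract refers to: a Lyapunov–Schmidt reduction near a symmetry-breaking bifurcation generically yields a pitchfork normal form, sketched below. This is the textbook generic picture; the paper's exact reduced equation and exponent are not reproduced here.

```latex
% Pitchfork normal form for the reduced amplitude a of the unstable mode,
% with bifurcation parameter \mu and critical value \mu_c (generic sketch):
\[
  \dot a \;=\; (\mu - \mu_c)\,a \;-\; c\,a^{3} \;+\; O(a^{5}), \qquad c > 0.
\]
% For \mu > \mu_c the homogeneous state a = 0 destabilizes, and a pair of
% stable non-homogeneous branches emerges with square-root amplitude scaling:
\[
  a^{*} \;=\; \pm\sqrt{\frac{\mu - \mu_c}{c}} \;\sim\; (\mu - \mu_c)^{1/2}.
\]
```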
Problem

Research questions and friction points this paper is trying to address.

oversmoothing
graph neural networks
representational collapse
bifurcation theory
homogeneous fixed point
Innovation

Methods, ideas, or system contributions that make the work stand out.

bifurcation theory
oversmoothing
non-monotonic activation
graph neural networks
Lyapunov-Schmidt reduction
Erkan Turan
École Polytechnique
Machine Learning
Gaspard Abel
Université Paris Saclay, Université Paris Cité, ENS Paris Saclay, CNRS, SSA, INSERM, Centre Borelli, F-91190, Gif-sur-Yvette, France; Centre d’Analyse et de Mathématique Sociales, EHESS, CNRS, 75006 Paris, France
Maysam Behmanesh
École Polytechnique
Machine Learning · Geometric Deep Learning · Graph Neural Networks · Multimodal Learning
Emery Pierson
LIX, Ecole Polytechnique, IP Paris
Maks Ovsjanikov
Ecole Polytechnique; Google DeepMind
3D Computer Vision · Geometry Processing · Shape Analysis · Shape Matching