🤖 AI Summary
This work addresses the challenges of oversmoothing—where node representations converge to indistinguishable vectors—and oversquashing—where long-range information propagation is impeded—in deep graph neural networks (GNNs), both rooted in the underlying graph topology. From a computational complexity perspective, the paper formulates the mitigation of these issues as a graph optimization problem guided by the spectral gap and graph conductance. It establishes, for the first time, that the optimal graph rewiring problem is NP-hard, with its decision version being NP-complete. Through spectral graph analysis and a reduction from the Minimum Bisection problem, the study reveals fundamental theoretical limits of graph rewiring approaches, delineating the boundary for enhancing GNN performance via structural optimization and thereby justifying the use of approximate or heuristic strategies in practice.
📝 Abstract
Graph Neural Networks (GNNs) face two fundamental challenges when scaled to deep architectures: oversmoothing, where node representations converge to indistinguishable vectors, and oversquashing, where information from distant nodes fails to propagate through bottlenecks. Both phenomena are intimately tied to the underlying graph structure, raising a natural question: can we optimize the graph topology to mitigate these issues? This paper provides a theoretical investigation of the computational complexity of such graph structure optimization. We formulate oversmoothing and oversquashing mitigation as graph optimization problems based on spectral gap and conductance, respectively. We prove that exact optimization for either problem is NP-hard through reductions from Minimum Bisection, establishing NP-completeness of the decision versions. Our results provide theoretical foundations for understanding the fundamental limits of graph rewiring for GNN optimization and justify the use of approximation algorithms and heuristic methods in practice.
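The two quantities the abstract optimizes over have standard spectral-graph-theory definitions: the spectral gap is the second-smallest eigenvalue λ₂ of the (normalized) Laplacian, and the conductance of a vertex set S is the cut weight leaving S divided by the smaller of the two side volumes. The paper's exact formulation is not reproduced here; the sketch below is an illustrative computation of both quantities under those standard definitions, on a small "two triangles joined by a bridge" graph whose bottleneck edge yields a small gap and low conductance.

```python
import numpy as np

def spectral_gap(A):
    """lambda_2 of the normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(A)) - d_inv_sqrt @ A @ d_inv_sqrt
    # eigvalsh returns eigenvalues in ascending order for symmetric matrices
    return np.linalg.eigvalsh(L)[1]

def conductance(A, S):
    """phi(S) = cut(S, V\\S) / min(vol(S), vol(V\\S))."""
    mask = np.zeros(len(A), dtype=bool)
    mask[list(S)] = True
    cut = A[mask][:, ~mask].sum()        # weight crossing the cut
    vol_S = A[mask].sum()                # sum of degrees inside S
    vol_rest = A[~mask].sum()
    return cut / min(vol_S, vol_rest)

# Two triangles {0,1,2} and {3,4,5} joined by the single edge (2,3):
# a structural bottleneck of the kind linked to oversquashing.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

print(spectral_gap(A))            # small gap: information mixes slowly
print(conductance(A, {0, 1, 2}))  # 1/7: one crossing edge over volume 7
```

Cheeger's inequality, λ₂/2 ≤ φ ≤ √(2λ₂), ties the two objectives together, which is why rewiring methods that raise the spectral gap also raise the worst-case conductance, and why hardness for one objective is closely related to hardness for the other.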