Position: Message-passing and spectral GNNs are two sides of the same coin

📅 2026-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the long-standing perception in graph neural network research that message-passing neural networks (MPNNs) and spectral methods are fundamentally incompatible, a divide that has impeded theoretical unification. By adopting the perspective of permutation-equivariant operators acting on graph signals, the paper establishes a unified framework showing that the two paradigms are largely equivalent in expressive power while retaining complementary strengths: MPNNs excel at capturing local structural patterns, whereas spectral approaches are better suited to analyzing signal smoothness, stability, and community structure. Drawing on tools from graph signal processing, spectral graph theory, and logical expressivity analysis, the study systematically maps the intrinsic connections and the distinct applicability boundaries of both model classes, thereby strengthening the theoretical foundations of graph representation learning.

📝 Abstract
Graph neural networks (GNNs) are commonly divided into message-passing neural networks (MPNNs) and spectral graph neural networks, reflecting two largely separate research traditions in machine learning and signal processing. This paper argues that this divide is mostly artificial, hindering progress in the field. We propose a viewpoint in which both MPNNs and spectral GNNs are understood as different parametrizations of permutation-equivariant operators acting on graph signals. From this perspective, many popular architectures are equivalent in expressive power, while genuine gaps arise only in specific regimes. We further argue that MPNNs and spectral GNNs offer complementary strengths. That is, MPNNs provide a natural language for discrete structure and expressivity analysis using tools from logic and graph isomorphism research, while the spectral perspective provides principled tools for understanding smoothing, bottlenecks, stability, and community structure. Overall, we posit that progress in graph learning will be accelerated by clearly understanding the key similarities and differences between these two types of GNNs, and by working towards unifying these perspectives within a common theoretical and conceptual framework rather than treating them as competing paradigms.
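To make the abstract's claim concrete, the sketch below illustrates (in a minimal NumPy setting not taken from the paper, and assuming the symmetric normalized Laplacian convention) how the same permutation-equivariant operator can be parametrized both ways: one step of degree-normalized message passing is exactly a spectral filter with frequency response g(λ) = 1 − λ.

```python
import numpy as np

# Tiny undirected example graph (a triangle with one pendant node), chosen
# for illustration only.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Message-passing view: propagate features with D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt

# Spectral view: the symmetric normalized Laplacian L = I - A_norm,
# diagonalized as L = U diag(lam) U^T; apply the filter g(lam) = 1 - lam.
L = np.eye(A.shape[0]) - A_norm
lam, U = np.linalg.eigh(L)
spectral_op = U @ np.diag(1.0 - lam) @ U.T

# Both parametrizations act identically on any graph signal X.
X = np.random.default_rng(0).normal(size=(A.shape[0], 3))
assert np.allclose(A_norm @ X, spectral_op @ X)
```

The genuine differences the abstract points to arise only once the filter is restricted (e.g., to low-order polynomials of L, which stay local, versus arbitrary functions of the eigenvalues, which can act globally).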
Problem

Research questions and friction points this paper is trying to address.

Keywords: message-passing GNNs, spectral GNNs, graph neural networks, permutation-equivariant operators, unified framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Keywords: message-passing GNNs, spectral GNNs, permutation-equivariant operators, graph signal processing, unified framework