Graph Kernel Neural Networks

📅 2021-12-14
🏛️ IEEE Transactions on Neural Networks and Learning Systems
📈 Citations: 25
Influential: 1
🤖 AI Summary
Standard convolutional operations cannot be applied directly to graph-structured data because of its irregular, non-Euclidean topology. Method: This paper proposes Graph Kernel-driven Learnable Structural Convolution (GK-Conv), a purely structural, end-to-end framework that operates directly on the non-Euclidean graph domain. GK-Conv avoids computing an explicit graph embedding and instead builds a parameterized structural convolution operator on top of generic graph kernel functions, enabling plug-and-play use of arbitrary graph kernels and yielding CNN-style, interpretable structural masks. The model is fully differentiable, and an extensive ablation study examines the impact of its hyperparameters. Contribution/Results: GK-Conv achieves competitive performance on standard graph classification and regression benchmarks, supporting the claim that strong generalization can be obtained from topology alone, without node or edge features.
📝 Abstract
The convolution operator at the core of many modern neural architectures can effectively be seen as performing a dot product between an input matrix and a filter. While this is readily applicable to data such as images, which can be represented as regular grids in the Euclidean space, extending the convolution operator to work on graphs proves more challenging, due to their irregular structure. In this article, we propose to use graph kernels, i.e., kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain. This allows us to define an entirely structural model that does not require computing the embedding of the input graph. Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability in terms of the structural masks that are learned during the training process, similar to what happens for convolutional masks in traditional convolutional neural networks (CNNs). We perform an extensive ablation study to investigate the model hyperparameters’ impact and show that our model achieves competitive performance on standard graph classification and regression datasets.
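The kernel-as-convolution idea in the abstract can be sketched in a few lines: each output channel is the kernel response between the input graph and a small "filter graph" (a structural mask, which in the paper is learned end to end). The random-walk kernel, the `gk_conv` helper, and the toy graphs below are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

def random_walk_kernel(a1, a2, steps=3):
    """Count common walks of a fixed length via the direct
    (Kronecker) product of the two adjacency matrices."""
    ax = np.kron(a1, a2)
    ones = np.ones(ax.shape[0])
    return float(ones @ np.linalg.matrix_power(ax, steps) @ ones)

def gk_conv(adj, masks, kernel=random_walk_kernel):
    """One structural 'convolution': the response of the input graph
    to each structural mask is a kernel value, giving a fixed-size
    feature vector. Any graph kernel can be plugged in via `kernel`."""
    return np.array([kernel(adj, m) for m in masks])

# Toy input: adjacency matrix of a 4-node cycle.
cycle = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)

# Two hypothetical structural masks: a triangle and a 3-node path.
triangle = np.ones((3, 3)) - np.eye(3)
path = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 0]], dtype=float)

features = gk_conv(cycle, [triangle, path])  # one value per mask
```

The denser triangle mask yields a larger walk count than the path mask, so the feature vector already discriminates between the two local structures; swapping `random_walk_kernel` for any other kernel function of two adjacency matrices illustrates the plug-and-play property.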
Problem

Research questions and friction points this paper is trying to address.

Extending convolution operators to irregular graph structures
Developing graph kernels for structural learning without embeddings
Providing interpretable structural masks through graph kernel networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses graph kernels for convolution on graphs
Eliminates need for input graph embeddings
Allows plug-in of various graph kernel types
Luca Cosmo
Ca’ Foscari University of Venice, Venice, Italy
Computer Vision · Machine Learning · Geometric Deep Learning
G. Minello
Ca’ Foscari University of Venice, Venice, Italy
M. Bronstein
Oxford University, United Kingdom
E. Rodolà
Sapienza University of Rome, Rome, Italy
L. Rossi
The Hong Kong Polytechnic University, Hong Kong
A. Torsello
Ca’ Foscari University of Venice, Venice, Italy