AI Summary
This work proposes the first unified graph limit framework capable of handling both sparse and dense graphs of arbitrary size, overcoming the limitations of existing graph neural network theories that rely on restrictive assumptions about graph density or scale. By constructing a compact metric space of graphs and extending graph operator (graphop) analysis, the authors establish Hölder continuity of message-passing graph neural networks on this space. This theoretical foundation yields a stronger universal approximation theorem and tighter generalization error bounds, improving on prior results that treat sparse and dense graphs separately.
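For context, Hölder continuity here is the standard metric-space notion; the following is the textbook definition, not a statement specific to this paper's metric:

```latex
% A map f : (X, d_X) -> (Y, d_Y) between metric spaces is Hölder continuous
% with exponent \alpha \in (0, 1] if there is a constant C > 0 such that
\[
  d_Y\bigl(f(x), f(x')\bigr) \;\le\; C\, d_X(x, x')^{\alpha}
  \qquad \text{for all } x, x' \in X.
\]
```

The case $\alpha = 1$ recovers Lipschitz continuity; establishing such a bound for MPNNs over a compact space of graphs is what drives the approximation and generalization results summarized above.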
Abstract
Generalization and approximation capabilities of message passing graph neural networks (MPNNs) are often studied by defining a compact metric on a space of input graphs under which MPNNs are Hölder continuous. Such analyses are of two varieties: 1) when the metric space includes graphs of unbounded size, the theory is only appropriate for dense graphs, and 2) when studying sparse graphs, the metric space only includes graphs of uniformly bounded size. In this work, we present a unified approach, defining a compact metric on the space of graphs of all sizes, both sparse and dense, under which MPNNs are Hölder continuous. This leads to more powerful universal approximation theorems and generalization bounds than previous works. The theory is based on, and extends, a recent approach to graph limit theory called graphop analysis.
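To make the object of study concrete, here is a minimal sketch of one generic message-passing layer in NumPy. The sum aggregation, weight shapes, and tanh update are illustrative choices for a generic MPNN, not the paper's specific construction:

```python
import numpy as np

def message_passing_layer(adj, features, w_msg, w_upd):
    """One generic MPNN layer: aggregate neighbor messages, then update.

    adj:      (n, n) adjacency matrix of the input graph
    features: (n, d) node feature matrix
    w_msg:    (d, d) illustrative message weight matrix
    w_upd:    (2d, d) illustrative update weight matrix
    """
    # Message step: linear map of neighbor features, summed over neighbors
    # (sum aggregation is one common, illustrative choice).
    messages = adj @ (features @ w_msg)
    # Update step: combine each node's current state with its aggregated
    # messages and apply a nonlinearity.
    combined = np.concatenate([features, messages], axis=1)
    return np.tanh(combined @ w_upd)

# Tiny usage example on a 3-node path graph.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
x = rng.normal(size=(3, 4))
h = message_passing_layer(adj, x, rng.normal(size=(4, 4)), rng.normal(size=(8, 4)))
print(h.shape)  # (3, 4): one updated feature vector per node
```

The paper's analysis concerns how maps of this kind behave as the input graph varies in a compact metric space of graphs of all sizes and densities, rather than any particular parameterization.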