Geometry-Aware Edge Pooling for Graph Neural Networks

📅 2025-06-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph neural network (GNN) pooling methods often compromise graph structural fidelity and interpretability, leading to unstable performance across datasets, tasks, and pooling ratios. To address this, we propose a geometry-aware edge-collapsing pooling method. Our approach is the first to adopt "magnitude", a diversity measure from diffusion geometry, to quantify structural diversity, and introduces the "spread" of a metric space as a faster and numerically more stable alternative. By integrating spectral-preserving optimisation with edge-collapse operations, the method largely retains the original graph's metric structure and spectral properties while reducing its size. Extensive experiments demonstrate that our method consistently outperforms state-of-the-art pooling baselines across diverse graph classification benchmarks. Moreover, it maintains high accuracy and strong robustness under varying pooling ratios, offering improved generalisability and reliability for downstream GNN applications.

📝 Abstract
Graph Neural Networks (GNNs) have shown significant success for graph-based tasks. Motivated by the prevalence of large datasets in real-world applications, pooling layers are crucial components of GNNs. By reducing the size of input graphs, pooling enables faster training and potentially better generalisation. However, existing pooling operations often optimise for the learning task at the expense of fundamental graph structures and interpretability. This leads to unreliable performance across varying dataset types, downstream tasks, and pooling ratios. Addressing these concerns, we propose novel graph pooling layers for structure-aware pooling via edge collapses. Our methods leverage diffusion geometry and iteratively reduce a graph's size while preserving both its metric structure and structural diversity. We guide pooling using magnitude, an isometry-invariant diversity measure, which permits us to control the fidelity of the pooling process. Further, we use the spread of a metric space as a faster and more stable alternative, ensuring computational efficiency. Empirical results demonstrate that our methods (i) achieve superior performance compared to alternative pooling layers across a range of diverse graph classification tasks, (ii) preserve key spectral properties of the input graphs, and (iii) retain high accuracy across varying pooling ratios.
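The two measures named in the abstract have standard definitions on a finite metric space with distance matrix D: magnitude sums the weighting vector w solving Zw = 1 with Z_ij = exp(-t·d_ij), while the spread replaces the matrix inverse with simple row sums. A minimal pure-Python sketch (not the paper's implementation; the scale parameter t and the tiny Gaussian-elimination solver are illustrative choices):

```python
import math

def magnitude(D, t=1.0):
    """Magnitude of a finite metric space with distance matrix D:
    solve Z w = 1 with Z[i][j] = exp(-t * D[i][j]), return sum(w)."""
    n = len(D)
    # Augmented system [Z | 1], solved by Gauss-Jordan elimination.
    A = [[math.exp(-t * D[i][j]) for j in range(n)] + [1.0] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col:
                f = A[r][col] / A[col][col]
                for c in range(col, n + 1):
                    A[r][c] -= f * A[col][c]
    w = [A[i][n] / A[i][i] for i in range(n)]
    return sum(w)

def spread(D, t=1.0):
    """Spread E_t(X) = sum_i 1 / sum_j exp(-t * d(i, j)): no matrix
    inverse needed, hence faster and numerically more stable."""
    n = len(D)
    return sum(1.0 / sum(math.exp(-t * D[i][j]) for j in range(n))
               for i in range(n))

# Two points at distance 1: magnitude = 2 / (1 + e^{-1}), and for
# homogeneous spaces like this one the spread agrees with it.
D2 = [[0.0, 1.0], [1.0, 0.0]]
print(magnitude(D2))  # ≈ 1.4621
print(spread(D2))     # ≈ 1.4621
```

Both quantities grow from 1 (all points indistinguishable) toward the number of points as distances grow, which is what makes them usable as diversity measures during pooling.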
Problem

Research questions and friction points this paper is trying to address.

Optimize graph pooling while preserving fundamental structures
Maintain interpretability across varying datasets and tasks
Balance computational efficiency with structural diversity preservation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Edge collapses for structure-aware graph pooling
Diffusion geometry preserves metric and diversity
Isometry-invariant magnitude guides pooling fidelity
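The bullets above can be illustrated with a toy sketch of diversity-guided edge collapse. This is a hypothetical greedy variant for intuition, not the paper's method (which also preserves spectral properties): graphs are plain adjacency dicts, distances are unit-weight shortest paths, and each step contracts the edge whose collapse changes the spread of the graph's metric the least.

```python
import math

def shortest_paths(adj):
    """All-pairs shortest-path distances (Floyd-Warshall, unit edge weights)."""
    nodes = sorted(adj)
    idx = {u: i for i, u in enumerate(nodes)}
    n = len(nodes)
    D = [[0.0 if i == j else float("inf") for j in range(n)] for i in range(n)]
    for u in adj:
        for v in adj[u]:
            D[idx[u]][idx[v]] = 1.0
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

def spread(D, t=1.0):
    """Spread of the metric space; exp(-inf) == 0.0 handles disconnection."""
    n = len(D)
    return sum(1.0 / sum(math.exp(-t * D[i][j]) for j in range(n))
               for i in range(n))

def collapse(adj, u, v):
    """Contract edge (u, v): merge node v into node u, dropping self-loops."""
    new = {x: set(ys) for x, ys in adj.items() if x != v}
    for x in new:
        if v in new[x]:
            new[x].discard(v)
            if x != u:
                new[x].add(u)
    new[u] |= {x for x in adj[v] if x != u}
    return new

def pool(adj, ratio=0.5, t=1.0):
    """Greedily contract the edge whose collapse least perturbs the spread."""
    target = max(1, int(len(adj) * ratio))
    g = {u: set(vs) for u, vs in adj.items()}
    while len(g) > target:
        edges = [(u, v) for u in g for v in g[u] if u < v]
        if not edges:
            break
        base = spread(shortest_paths(g), t)
        best = min(edges,
                   key=lambda e: abs(spread(shortest_paths(collapse(g, *e)), t)
                                     - base))
        g = collapse(g, *best)
    return g

# Pool a 4-node path graph down to half its size.
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(len(pool(path4, ratio=0.5)))  # 2
```

Scoring every candidate edge against the full metric is cubic per step; this is exactly the cost pressure that motivates the paper's use of spread over magnitude, since spread avoids a matrix inversion per evaluation.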