🤖 AI Summary
Existing graph pooling methods struggle to simultaneously preserve graph structure, enable adaptive clustering, and maintain computational efficiency. To address this, we propose SpaPool, a soft-assignment pooling method that combines dense intra-cluster and sparse inter-cluster strategies. Its core innovations are: (i) a learnable soft partitioning mechanism that adaptively groups nodes into a dynamically determined number of clusters, avoiding the structural distortion caused by hard assignments; and (ii) explicit modeling of dense intra-cluster interactions and sparse inter-cluster connections during aggregation, balancing representational power against computational cost. Extensive experiments show that SpaPool matches or surpasses state-of-the-art pooling methods across multiple benchmark datasets. Notably, it achieves significant accuracy gains on small-scale graphs for both classification and regression tasks while reducing memory consumption and computational overhead, validating its structural preservation, adaptivity, and practical utility.
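The coarsening step described above — a soft cluster assignment followed by intra-cluster feature aggregation and inter-cluster adjacency coarsening — can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the `soft_assignment_pool` name, the random-projection assignment, the fixed cluster count `k`, and the `eps` sparsification threshold are all assumptions for demonstration; SpaPool learns the assignment and determines the number of clusters adaptively.

```python
import numpy as np

def soft_assignment_pool(X, A, k, eps=1e-6, rng=None):
    """Illustrative soft-assignment graph pooling (DiffPool-style coarsening).

    X: (n, d) node feature matrix; A: (n, n) adjacency matrix;
    k: number of clusters (SpaPool determines this adaptively; fixed here).
    """
    rng = np.random.default_rng(rng)
    # Hypothetical stand-in for a learned assignment network:
    # project features and normalize rows with a softmax.
    W = rng.standard_normal((X.shape[1], k))
    logits = X @ W
    S = np.exp(logits - logits.max(axis=1, keepdims=True))
    S /= S.sum(axis=1, keepdims=True)        # soft assignment: each row sums to 1

    X_pool = S.T @ X                          # dense intra-cluster feature aggregation
    A_pool = S.T @ A @ S                      # coarsened inter-cluster adjacency
    # Illustrative sparsification of weak inter-cluster links (assumed threshold).
    A_pool = np.where(A_pool >= eps, A_pool, 0.0)
    return X_pool, A_pool, S
```

In a trained model the assignment `S` would come from a learnable module and gradients would flow through the pooling; the matrix products above show why soft assignments preserve structure that a hard top-k selection would discard.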
📝 Abstract
This paper introduces SpaPool, a novel pooling method that combines the strengths of dense and sparse techniques for graph neural networks. SpaPool groups vertices into an adaptive number of clusters, reducing the size of the graph efficiently while maintaining its structural integrity. Experimental results on several datasets demonstrate that SpaPool achieves competitive performance compared to existing pooling techniques and excels particularly on small-scale graphs. This makes SpaPool a promising method for applications requiring efficient and effective graph processing.