Unsupervised Graph Clustering with Deep Structural Entropy

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph clustering methods built on graph neural networks (GNNs), graph attention networks (GATs), and contrastive learning suffer degraded performance on sparse or noisy graphs, due to low-quality adjacency matrices and the decoupling of embedding learning from cluster formation. To address this, the authors propose DeSE, an end-to-end unsupervised graph clustering framework. Its core contributions are threefold: (1) a differentiable structural entropy with soft assignment that quantifies and optimizes structural information in graphs; (2) a Structural Learning layer (SLL) that generates an attribute-augmented graph to strengthen sparse topologies; and (3) a stackable GNN-based assignment (ASS) layer that learns node embeddings and a soft assignment matrix, jointly minimizing structural entropy and an edge-based cross-entropy loss. Evaluated on four benchmark datasets against eight representative baselines, DeSE achieves superior clustering accuracy while preserving structural interpretability.
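The key idea in contribution (1) is that classical two-level structural entropy, normally defined over a hard partition, can be relaxed with a soft assignment matrix so that it becomes differentiable and usable as a training loss. The sketch below illustrates that relaxation using the standard two-level structural entropy formula (cluster-level cut term plus node-level term); it is an assumption about the general construction, not the paper's exact loss, and the function name and `eps` smoothing are mine.

```python
import numpy as np

def soft_structural_entropy(A, S, eps=1e-12):
    """Two-level structural entropy relaxed by a soft assignment.

    A : (n, n) symmetric weighted adjacency matrix.
    S : (n, k) soft cluster assignment; each row sums to 1.
    Illustrative sketch of the idea behind DeSE's differentiable
    structural entropy, not the paper's exact formulation.
    """
    d = A.sum(axis=1)              # node degrees (node volumes)
    vol = d.sum()                  # total graph volume
    V = S.T @ d                    # soft cluster volumes
    intra = np.diag(S.T @ A @ S)   # soft intra-cluster edge weight (counted twice)
    g = V - intra                  # soft cut: edge weight leaving each cluster
    # Cluster-level term: cost of entering each cluster through a cut edge.
    H_clusters = -np.sum((g / vol) * np.log2(V / vol + eps))
    # Node-level term: cost of locating each node within its cluster(s),
    # weighted by the soft membership S.
    H_nodes = -np.sum(
        S * (d[:, None] / vol) * np.log2(d[:, None] / (V[None, :] + eps) + eps)
    )
    return H_clusters + H_nodes
```

Because every step is a smooth function of `S`, the same computation written in an autodiff framework can be minimized by gradient descent, which is what makes end-to-end clustering possible. On two disjoint triangles, the correct hard assignment yields zero cut cost and a lower entropy than a uniform soft assignment.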

📝 Abstract
Research on Graph Structure Learning (GSL) provides key insights for graph-based clustering, yet current methods like Graph Neural Networks (GNNs), Graph Attention Networks (GATs), and contrastive learning often rely heavily on the original graph structure. Their performance deteriorates when the original graph's adjacency matrix is too sparse or contains noisy edges unrelated to clustering. Moreover, these methods depend on learning node embeddings and using traditional techniques like k-means to form clusters, which may not fully capture the underlying graph structure between nodes. To address these limitations, this paper introduces DeSE, a novel unsupervised graph clustering framework incorporating Deep Structural Entropy. It enhances the original graph with quantified structural information and deep neural networks to form clusters. Specifically, we first propose a method for calculating structural entropy with soft assignment, which quantifies structure in a differentiable form. Next, we design a Structural Learning layer (SLL) to generate an attributed graph from the original feature data, serving as a target to enhance and optimize the original structural graph, thereby mitigating the issue of sparse connections between graph nodes. Finally, our clustering assignment method (ASS), based on GNNs, learns node embeddings and a soft assignment matrix to form clusters on the enhanced graph. The ASS layer can be stacked to meet downstream task requirements, minimizing structural entropy for stable clustering and maximizing node consistency with an edge-based cross-entropy loss. Extensive comparative experiments are conducted on four benchmark datasets against eight representative unsupervised graph clustering baselines, demonstrating the superiority of DeSE in both effectiveness and interpretability.
Problem

Research questions and friction points this paper is trying to address.

Improves clustering on sparse or noisy graphs
Quantifies graph structure with deep structural entropy
Enhances node embeddings and cluster assignments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep Structural Entropy for graph clustering
Structural Learning layer enhances sparse graphs
GNN-based clustering with soft assignment matrix
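The second innovation, the Structural Learning layer, builds an attribute-based graph from node features to supply candidate edges where the original topology is sparse. A common way to realize such a layer is a learnable projection followed by cosine similarity and top-k sparsification; the sketch below follows that generic recipe, and the projection `W` and `top_k` parameter are hypothetical stand-ins, since the paper's SLL details are not given here.

```python
import numpy as np

def structure_learning_layer(X, W, top_k=10):
    """Sketch of a structure-learning step: derive an attribute graph
    from node features X via a (hypothetical) learnable projection W,
    keeping only the strongest links per node. Illustrative of the
    SLL's role in DeSE, not its exact architecture.
    """
    Z = np.maximum(X @ W, 0.0)                 # project features + ReLU
    norm = np.linalg.norm(Z, axis=1, keepdims=True) + 1e-12
    sim = (Z / norm) @ (Z / norm).T            # pairwise cosine similarity
    np.fill_diagonal(sim, -np.inf)             # forbid self-loops
    A_attr = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, :top_k]  # k strongest neighbors per node
    rows = np.arange(sim.shape[0])[:, None]
    A_attr[rows, idx] = sim[rows, idx]         # keep only those weights
    return np.maximum(A_attr, A_attr.T)        # symmetrize the graph
```

The resulting `A_attr` can then serve as a target for enhancing the original adjacency matrix, so that clustering no longer depends solely on a sparse or noisy input topology.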