Rethinking Message Passing Neural Networks with Diffusion Distance-guided Stress Majorization

📅 2025-11-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Message Passing Neural Networks (MPNNs) suffer from over-smoothing and over-correlation due to their underlying objective of Dirichlet energy minimization, leading to representation collapse and degraded discriminability. Method: The paper proposes DDSM (Diffusion Distance-guided Stress Majorization), a message-passing framework that replaces conventional neighborhood aggregation with operations guided by diffusion distance—a robust, global structural metric—and integrates stress majorization with orthogonal regularization to preserve discriminative node representations. Theoretically, DDSM mitigates representation collapse while requiring no structural priors. Contribution/Results: DDSM provides a unified treatment of both homophilic and heterophilic graphs, consistently outperforming 15 strong baselines on node classification benchmarks. Extensive experiments validate its generalizability, robustness, and effectiveness across diverse graph topologies.
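The diffusion distance mentioned in the summary can be illustrated with a minimal sketch: it measures how differently random walks spread out from two nodes. The toy path graph, the lazy-walk variant, and the step count `t=3` below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Toy adjacency matrix of a 5-node path graph (hypothetical example,
# not a dataset from the paper).
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

def diffusion_distances(A, t=3):
    """Pairwise t-step diffusion distances d(u, v) = ||P^t[u] - P^t[v]||_2,
    where P is a lazy random-walk transition matrix (laziness avoids
    parity artifacts on bipartite graphs like a path)."""
    n = A.shape[0]
    P = 0.5 * (np.eye(n) + A / A.sum(axis=1, keepdims=True))
    Pt = np.linalg.matrix_power(P, t)       # t-step walk distributions
    diff = Pt[:, None, :] - Pt[None, :, :]  # pairwise distribution gaps
    return np.linalg.norm(diff, axis=2)

D = diffusion_distances(A, t=3)  # symmetric, zero on the diagonal
```

Because it aggregates all t-step walk probabilities rather than a single shortest path, this metric captures global structure, which is what makes it suitable for guiding message passing on heterophilic graphs.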

📝 Abstract
Message passing neural networks (MPNNs) have emerged as go-to models for learning on graph-structured data in the past decade. Despite their effectiveness, most such models still incur severe issues such as over-smoothing and over-correlation, due to their underlying objective of minimizing the Dirichlet energy and the derived neighborhood aggregation operations. In this paper, we propose DDSM, a new MPNN model built on an optimization framework that includes stress majorization and orthogonal regularization for overcoming the above issues. Further, we introduce diffusion distances for nodes into the framework to guide the new message passing operations and develop efficient algorithms for distance approximations, both backed by rigorous theoretical analyses. Our comprehensive experiments showcase that DDSM consistently and considerably outperforms 15 strong baselines on both homophilic and heterophilic graphs.
Problem

Research questions and friction points this paper is trying to address.

Overcoming over-smoothing and over-correlation in graph neural networks
Redesigning message passing using diffusion distance-guided optimization
Improving performance on both homophilic and heterophilic graph datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diffusion distance-guided stress majorization optimization framework
Orthogonal regularization to counter over-smoothing and over-correlation
Efficient algorithms for diffusion distance approximations
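The optimization framework behind the innovations above can be sketched numerically: a stress objective pushes embedding distances to match target (e.g., diffusion) distances, while an orthogonality penalty decorrelates embedding dimensions. The random target matrix, the plain numerical-gradient descent loop, and the weight `lam` are illustrative assumptions; the paper uses majorization-based updates rather than this toy loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target distances for 6 nodes (stand-in for precomputed
# diffusion distances): any symmetric, zero-diagonal matrix works here.
n, k = 6, 3
D = np.abs(rng.normal(size=(n, n)))
D = (D + D.T) / 2
np.fill_diagonal(D, 0.0)

X = rng.normal(size=(n, k))  # node embeddings to optimize

def loss(X, D, lam=0.1):
    """Stress term: embedding distances should reproduce D.
    Orthogonality term: penalizes correlated embedding dimensions,
    a stand-in for the paper's orthogonal regularizer."""
    diff = X[:, None, :] - X[None, :, :]
    dist = np.linalg.norm(diff, axis=2)
    stress = np.sum(np.triu((dist - D) ** 2, k=1))
    G = X.T @ X
    ortho = np.linalg.norm(G - np.diag(np.diag(G))) ** 2
    return stress + lam * ortho

def grad(f, X, eps=1e-5):
    """Central-difference numerical gradient (for illustration only)."""
    g = np.zeros_like(X)
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            E = np.zeros_like(X)
            E[i, j] = eps
            g[i, j] = (f(X + E) - f(X - E)) / (2 * eps)
    return g

f = lambda X: loss(X, D)
l0 = f(X)
for _ in range(50):
    X = X - 0.01 * grad(f, X)  # loss decreases as distances are matched
```

Unlike Dirichlet energy minimization, which only pulls neighboring embeddings together (and hence collapses them in the limit), the stress term also pushes apart nodes whose target distance is large, which is the intuition for why this objective resists over-smoothing.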
Haoran Zheng
Hong Kong Baptist University
Graph learning · Graph algorithms
Renchi Yang
Hong Kong Baptist University, Hong Kong SAR, China
Yubo Zhou
University of Electronic Science and Technology of China
Medical Image Analysis · Self-supervised Learning
Jianliang Xu
Hong Kong Baptist University, Hong Kong SAR, China