🤖 AI Summary
Message passing neural networks (MPNNs) suffer from over-smoothing and over-correlation because their neighborhood aggregation implicitly minimizes the Dirichlet energy, causing representation collapse and degraded discriminability. Method: The paper proposes DDSM, a new message-passing framework built on an optimization objective that combines stress majorization with orthogonal regularization, guided by diffusion distances—a robust, global structural metric—in place of purely local neighborhood aggregation, to preserve discriminative node representations. Theoretically, DDSM mitigates representation collapse while requiring no structural priors. Contribution/Results: DDSM handles both homophilic and heterophilic graphs in a unified way; across comprehensive experiments it consistently and considerably outperforms 15 strong baselines on node classification, demonstrating generalizability, robustness, and effectiveness across diverse graph topologies.
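For context on the two objectives the summary contrasts, the Dirichlet energy that standard neighborhood aggregation implicitly minimizes, and a generic stress majorization objective, can be written as follows. These are the textbook forms, not the paper's exact notation; the weights $w_{uv}$ and target distances $\delta_{uv}$ (diffusion distances in DDSM) are stated here as assumptions:

$$
E_{\mathrm{Dir}}(\mathbf{X}) \;=\; \frac{1}{2}\sum_{(u,v)\in\mathcal{E}} \bigl\| \mathbf{x}_u - \mathbf{x}_v \bigr\|_2^2,
\qquad
\mathrm{stress}(\mathbf{X}) \;=\; \sum_{u<v} w_{uv}\,\bigl( \|\mathbf{x}_u - \mathbf{x}_v\|_2 - \delta_{uv} \bigr)^2 .
$$

Driving $E_{\mathrm{Dir}}$ toward zero pushes neighboring embeddings to coincide (over-smoothing), whereas minimizing stress matches embedding distances to nonzero target distances, which resists collapse.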
📝 Abstract
Message passing neural networks (MPNNs) have emerged as the go-to models for learning on graph-structured data over the past decade. Despite their effectiveness, most such models still suffer from severe issues such as over-smoothing and over-correlation, due to their underlying objective of minimizing the Dirichlet energy and the neighborhood aggregation operations derived from it. In this paper, we propose DDSM, a new MPNN model built on an optimization framework that combines stress majorization and orthogonal regularization to overcome these issues. Further, we introduce diffusion distances between nodes into the framework to guide the new message passing operations and develop efficient algorithms for approximating these distances, both backed by rigorous theoretical analyses. Our comprehensive experiments show that DDSM consistently and considerably outperforms 15 strong baselines on both homophilic and heterophilic graphs.
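To make the notion of diffusion distance concrete, here is a naive dense-matrix sketch: the distance between two nodes is the difference between their $t$-step random-walk distributions. This is an illustration only, not the paper's efficient approximation algorithm; the function name `diffusion_distances`, the unweighted L2 comparison, and the choice of `t` are our own assumptions.

```python
import numpy as np

def diffusion_distances(A, t=3):
    """Pairwise diffusion distances from t-step random-walk profiles.

    A: dense adjacency matrix (assumed to have no isolated nodes).
    Returns an (n, n) matrix D with D[u, v] = ||P^t[u, :] - P^t[v, :]||_2,
    where P = D^{-1} A is the row-stochastic transition matrix.
    """
    deg = A.sum(axis=1)
    P = A / deg[:, None]                  # random-walk transition matrix
    Pt = np.linalg.matrix_power(P, t)     # t-step diffusion profiles (rows)
    diff = Pt[:, None, :] - Pt[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

# Usage: a path graph 0-1-2-3. The result is symmetric with a zero
# diagonal, and mirror-image node pairs (0,1) and (3,2) get equal distances.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = diffusion_distances(A, t=3)
```

Because the profiles are global (they depend on the whole graph through `P**t`), two nodes can be close in diffusion distance without being adjacent, which is what lets this metric serve homophilic and heterophilic graphs alike. The dense `matrix_power` costs O(n^3) per step, which is exactly why the paper develops approximation algorithms.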