🤖 AI Summary
In link prediction, conventional GNNs adopt a homogeneous modeling paradigm for all node pairs, ignoring the inherent heterogeneity in their pairwise feature requirements and thereby limiting predictive performance. This work first identifies and quantifies substantial heterogeneity across node pairs in the pairwise features they require (e.g., common neighbors, shortest-path distance). To address this, we propose Link-MoE: a Mixture-of-Experts (MoE) GNN architecture in which multiple specialized GNNs serve as experts and a learnable gating mechanism, conditioned on pairwise heuristic features, adaptively selects the most suitable experts for each node pair. Link-MoE achieves a relative improvement of 18.71% in MRR on Pubmed and 9.59% in Hits@100 on ogbl-ppa over the best baselines. Our core contribution lies in characterizing and explicitly modeling pairwise feature heterogeneity, a shift away from the prevailing homogeneous GNN design for link prediction.
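To make the gating idea concrete, below is a minimal sketch of an MoE-style link scorer. It is not the authors' implementation: it assumes the experts are already-trained GNN link predictors whose per-pair scores are available, and that the gate is a small MLP conditioned on pairwise heuristics (common-neighbor count, shortest-path distance, etc.). All class and variable names are illustrative.

```python
import torch
import torch.nn as nn

class LinkMoEGate(nn.Module):
    """Illustrative sketch of per-pair expert mixing (names are hypothetical).

    Each expert is any link scorer that maps a node pair to a logit (e.g., a
    trained GNN4LP model). The gate maps pairwise heuristic features to
    mixing weights, so different node pairs can rely on different experts.
    """

    def __init__(self, num_experts: int, num_pair_feats: int, hidden: int = 64):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(num_pair_feats, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_experts),
        )

    def forward(self, expert_scores: torch.Tensor, pair_feats: torch.Tensor) -> torch.Tensor:
        # expert_scores: [batch, num_experts] logits from the (frozen) experts
        # pair_feats:    [batch, num_pair_feats] heuristics for each node pair
        weights = torch.softmax(self.gate(pair_feats), dim=-1)  # [batch, num_experts]
        return (weights * expert_scores).sum(dim=-1)            # [batch] mixed logits

# Toy usage: 4 experts, 3 pairwise features, a batch of 5 candidate pairs.
gate = LinkMoEGate(num_experts=4, num_pair_feats=3)
scores = gate(torch.randn(5, 4), torch.randn(5, 3))  # -> shape [5]
```

One natural training recipe under this sketch is to train the experts first, freeze them, and then fit only the gate; whether that matches the paper's exact procedure is an assumption here.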
📝 Abstract
Link prediction, which aims to forecast unseen connections in graphs, is a fundamental task in graph machine learning. Heuristic methods that leverage pairwise measures such as common neighbors and shortest paths often rival the performance of vanilla Graph Neural Networks (GNNs). Recent advancements in GNNs for link prediction (GNN4LP) have therefore focused primarily on integrating one or a few types of pairwise information. In this work, we reveal that different node pairs within the same dataset require different pairwise information for accurate prediction, and that models applying the same pairwise information uniformly across all pairs can achieve only suboptimal performance. As a result, we propose Link-MoE, a simple mixture-of-experts model for link prediction. Link-MoE uses various GNNs as experts and strategically selects the appropriate expert for each node pair based on various types of pairwise information. Experimental results across diverse real-world datasets demonstrate substantial performance improvements from Link-MoE. Notably, Link-MoE achieves a relative improvement of 18.71% on the MRR metric for the Pubmed dataset and 9.59% on the Hits@100 metric for the ogbl-ppa dataset, compared to the best baselines.
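For readers unfamiliar with the heuristics the abstract mentions, here is a small example of computing two standard pairwise measures with networkx; the function name and feature set are illustrative, not taken from the paper.

```python
import networkx as nx

def pairwise_features(G: nx.Graph, u, v):
    """Standard link-prediction heuristics for a candidate pair (u, v)."""
    # Common neighbors: size of the shared neighborhood of u and v.
    cn = len(set(G.neighbors(u)) & set(G.neighbors(v)))
    # Shortest-path distance; candidate pairs may be disconnected.
    try:
        spd = nx.shortest_path_length(G, u, v)
    except nx.NetworkXNoPath:
        spd = float("inf")
    return {"common_neighbors": cn, "shortest_path": spd}

# Example on a small graph: nodes 0 and 3 share neighbors 1 and 2.
G = nx.Graph([(0, 1), (0, 2), (1, 3), (2, 3)])
print(pairwise_features(G, 0, 3))  # {'common_neighbors': 2, 'shortest_path': 2}
```

In practice such features are computed in bulk over candidate edge lists, but the per-pair definitions are the ones shown here.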