🤖 AI Summary
Existing link prediction methods suffer from redundancy and over-smoothing when modeling multi-order common neighbors, degrading their ability to represent higher-order features. To address this, we propose the Orthogonal Common Neighbor (OCN) paradigm: (1) we introduce orthogonalization to eliminate linear redundancy among common neighbors of different orders; (2) we integrate normalization to mitigate over-smoothing; and (3) we formulate a differentiable, interpretable framework for high-order common neighbor modeling. Theoretical analysis shows that OCN preserves discriminative neighborhood structural information. On mainstream benchmarks, OCN achieves an average improvement of 7.7% over the strongest baseline, and ablation studies validate the effectiveness and necessity of each component.
📝 Abstract
Common Neighbors (CNs) and their higher-order variants are important pairwise features widely used in state-of-the-art link prediction methods. However, existing methods often struggle with information repeated across different orders of CNs and fail to fully leverage their potential. We identify that these limitations stem from two key issues: redundancy and over-smoothing in high-order common neighbors. To address these challenges, we design orthogonalization to eliminate redundancy between different-order CNs and normalization to mitigate over-smoothing. By combining these two techniques, we propose Orthogonal Common Neighbor (OCN), a novel approach that outperforms the strongest baselines by an average of 7.7% on popular link prediction benchmarks. A thorough theoretical analysis is provided to support our method, and ablation studies verify the effectiveness of both the orthogonalization and normalization techniques.
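To make the two core ideas concrete, here is a minimal sketch of how orthogonalization and normalization could be applied to multi-order CN features. This is an illustrative assumption, not the paper's actual formulation: it treats the k-th power of the adjacency matrix as the k-th-order CN score for each node pair, removes linear redundancy between orders via Gram-Schmidt, and unit-normalizes each order's feature vector. The function name `orthogonal_cn_features` and all details are hypothetical.

```python
import numpy as np

def orthogonal_cn_features(A, max_order=3):
    """Hypothetical sketch (not the paper's exact method): pairwise
    common-neighbor features for orders 1..max_order, with each order
    orthogonalized against lower orders to remove linear redundancy,
    then normalized to unit length."""
    n = A.shape[0]
    # k-th order CN scores ~ entries of A^k (walks of length k between endpoints)
    powers = [np.linalg.matrix_power(A, k) for k in range(1, max_order + 1)]
    # Flatten each order's pairwise score matrix into one feature vector
    feats = [P.astype(float).ravel() for P in powers]
    ortho = []
    for f in feats:
        g = f.copy()
        # Gram-Schmidt: subtract projections onto earlier (lower-order) features
        for q in ortho:
            g -= (g @ q) * q
        norm = np.linalg.norm(g)
        if norm > 1e-12:
            g /= norm  # normalization step
        ortho.append(g)
    # Reshape each orthogonalized feature vector back to an n x n pairwise matrix
    return [g.reshape(n, n) for g in ortho]
```

After this transform, the feature matrix for each order carries only the structural signal not already explained by lower orders, which is the redundancy-elimination intuition described in the abstract.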