🤖 AI Summary
To address performance degradation of graph neural networks (GNNs) on heterophilous graphs caused by self-reinforcing and phase-inconsistent signals, this paper proposes an interference-aware complex-valued GNN. Methodologically, it introduces a U(1) phase connection and a rank-1 projection to cancel self-interference; designs a joint sign- and phase-aware gating mechanism to suppress low-frequency interference; and replaces conventional additive aggregation with gauge-equivariant complex modeling and phase-sensitive pre-attenuation applied before attention. The core contribution is a systematic integration of physics-inspired interference-suppression principles into the message-passing framework. Extensive experiments demonstrate consistent state-of-the-art performance across multiple graph benchmarks, with notable improvements in generalization and training stability. This work establishes an interference-aware paradigm for heterophilous graph representation learning.
📝 Abstract
Graph Neural Networks (GNNs) excel on homophilous graphs but often fail under heterophily due to self-reinforcing and phase-inconsistent signals. We propose a Gauge-Equivariant Graph Network with Self-Interference Cancellation (GESC), which replaces additive aggregation with a projection-based interference mechanism. Unlike prior magnetic or gauge-equivariant GNNs, which typically focus on phase handling in spectral filtering while largely relying on scalar weighting, GESC introduces a $\mathrm{U}(1)$ phase connection followed by a rank-1 projection that attenuates self-parallel components before attention. A sign- and phase-aware gate further regulates neighbor influence, attenuating components aligned with current node states and acting as a local notch filter on low-frequency modes. Across diverse graph benchmarks, our method consistently outperforms recent state-of-the-art models while offering a unified, interference-aware view of message passing. Our code is available at https://anonymous.4open.science/r/GESC-1B22.
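The core mechanism described above can be sketched numerically: rotate each neighbor message by a U(1) phase, then apply a rank-1 projection that removes the component parallel to the receiving node's own state before aggregation. This is a minimal illustrative sketch, not the paper's implementation; the function name, the unit-normalized projector, and the plain mean aggregation (in place of the paper's attention) are all assumptions.

```python
import numpy as np

def interference_aware_aggregate(h_self, h_neighbors, phases):
    """Toy sketch of self-interference cancellation.

    h_self:      (d,) complex state of the receiving node.
    h_neighbors: (k, d) complex states of its k neighbors.
    phases:      (k,) real U(1) connection angles, one per edge.
    """
    # U(1) phase connection: rotate each incoming message by its edge phase.
    rotated = h_neighbors * np.exp(1j * phases)[:, None]
    # Rank-1 projection I - u u^H: remove the component of each message
    # parallel to the node's own (unit-normalized) state.
    u = h_self / (np.linalg.norm(h_self) + 1e-12)
    projected = rotated - np.outer(rotated @ u.conj(), u)
    # Simple mean aggregation stands in for the paper's attention step.
    return projected.mean(axis=0)
```

With zero phases and neighbors identical to the node itself, the output is (numerically) zero: the self-parallel component is fully cancelled, while components orthogonal to the node state pass through unchanged.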