🤖 AI Summary
To address the distortion and poor controllability of long-range information propagation in Graph Neural Networks (GNNs), this paper proposes port-Hamiltonian Graph Neural Networks (pH-GNNs). It introduces port-Hamiltonian dynamical systems as a physics-informed architectural prior for GNNs, unifying non-dissipative long-range message passing and non-conservative dynamics under a single, physically grounded inductive bias. Casting message passing as the discretization of a continuous-time dynamical system, pH-GNNs regulate the balance between information propagation and energy dissipation, with theoretical guarantees on information conservation over time. Practically, the scheme requires only a lightweight GCN backbone yet achieves state-of-the-art performance across multiple long-range graph benchmarks, significantly improving robustness and accuracy for propagation beyond six hops.
📄 Abstract
The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning, especially when considering long-range propagation. This calls for principled approaches that control and regulate the degree of propagation and dissipation of information throughout the neural flow. Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks, a novel framework that models neural information flow in graphs by building on the laws of conservation of Hamiltonian dynamical systems. We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors, introducing tools from mechanical systems to gauge the equilibrium between the two components. Our approach can be applied to general message-passing architectures, and it provides theoretical guarantees on information conservation in time. Empirical results prove the effectiveness of our port-Hamiltonian scheme in pushing simple graph convolutional architectures to state-of-the-art performance in long-range benchmarks.
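The abstract's core idea, a message-passing flow whose conservative part follows Hamilton's equations while a "port" term adds controlled dissipation, can be sketched numerically. The following is a hedged toy illustration, not the paper's implementation: node states are split into "position" `q` and "momentum" `p`, the energy couples nodes through a graph Laplacian, and a `damping` coefficient plays the role of the dissipative port. All names (`hamiltonian`, `ph_step`, `damping`) are illustrative assumptions.

```python
import numpy as np

def hamiltonian(q, p, L, W):
    """Toy energy: kinetic term plus a graph-coupled potential.

    q, p: (num_nodes, dim) node states; L: graph Laplacian; W: feature coupling.
    """
    kinetic = 0.5 * np.sum(p ** 2)
    potential = 0.5 * np.sum(q * (L @ q @ W))
    return kinetic + potential

def ph_step(q, p, L, W, dt=0.01, damping=0.0):
    """One semi-implicit (symplectic) Euler step of the port-Hamiltonian flow:

        dq/dt =  dH/dp
        dp/dt = -dH/dq - R * dH/dp    (R = damping >= 0)
    """
    dHdq = 0.5 * (L @ q @ W + L.T @ q @ W.T)   # gradient of the potential in q
    p = p - dt * (dHdq + damping * p)          # momentum first (dissipative port)
    q = q + dt * p                             # then position (symplectic update)
    return q, p
```

With `damping=0` the discrete flow approximately conserves `H`, mimicking non-dissipative long-range propagation; `damping > 0` recovers the non-conservative regime, which is the equilibrium the paper proposes to gauge.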