Toward Effective Digraph Representation Learning: A Magnetic Adaptive Propagation based Approach

📅 2025-01-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing magnetically enhanced directed graph neural networks (MagDG) rely on manually tuned hyperparameters (e.g., the complex-phase parameter *q*) for message passing in the complex domain, resulting in inflexible architectures, coarse-grained information propagation, and neglect of node- and edge-level contextual semantics. To address these limitations, we propose an adaptive complex-domain propagation mechanism: (i) a plug-and-play Magnetic Adaptive Propagation (MAP) module compatible with arbitrary MagDG backbones; and (ii) an end-to-end differentiable framework, MAP++, integrating magnetic Laplacian-based complex graph convolution, parameterized adaptive propagation, differentiable edge-weight learning, and node-aware aggregation. Extensive experiments across 12 benchmark datasets demonstrate strong efficiency and scalability. MAP++ achieves state-of-the-art performance across four diverse downstream tasks—including node classification, link prediction, graph classification, and directed graph generation—validating its effectiveness and generalizability.
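The core idea of replacing the single hand-tuned phase parameter *q* with edge-wise adaptive phases can be sketched in the complex domain. Below is a minimal NumPy forward pass, assuming a sigmoid-squashed per-edge parameter matrix (all names and shapes here are hypothetical illustrations, not the paper's actual MAP/MAP++ implementation, which is learned end-to-end in a deep-learning framework):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy digraph: adjacency and node features (assumed shapes, for illustration).
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
X = rng.standard_normal((3, 4))

# Edge-wise phase parameters replace the single hand-crafted q.
# In MAP++ these would be learned; here they are randomly initialized.
theta_edge = rng.standard_normal(A.shape)      # raw per-edge parameters
q_edge = 0.5 / (1.0 + np.exp(-theta_edge))     # squash into (0, 0.5)
q_sym = 0.5 * (q_edge + q_edge.T)              # symmetrize so H stays Hermitian

A_s = 0.5 * (A + A.T)                          # symmetrized adjacency
Phase = 2.0 * np.pi * q_sym * (A - A.T)        # antisymmetric phase matrix
H = A_s * np.exp(1j * Phase)                   # Hermitian "magnetic" adjacency

# One complex-domain propagation step: each node aggregates phase-rotated
# neighbor features, so edge direction is encoded in the complex argument.
out = H @ X.astype(complex)
```

Because `q_sym` is symmetric and `A - A.T` antisymmetric, `H` remains Hermitian, which keeps the induced magnetic Laplacian's spectrum real, the property the fixed-*q* construction also relies on.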

📝 Abstract
The $q$-parameterized magnetic Laplacian serves as the foundation of directed graph (digraph) convolution, enabling this class of digraph neural networks (MagDGs) to encode node features and structural insights via complex-domain message passing. As a generalization of undirected methods, MagDG shows superior capability in modeling intricate web-scale topology. Despite the great success achieved by existing MagDGs, limitations remain: (1) Hand-crafted $q$: The performance of MagDGs depends on selecting an appropriate $q$-parameter to construct suitable graph propagation equations in the complex domain. This parameter tuning, driven by downstream tasks, limits model flexibility and significantly increases manual effort. (2) Coarse Message Passing: Most approaches apply the same complex-domain propagation and aggregation rules to all nodes, neglecting their unique digraph contexts. This oversight results in sub-optimal performance. To address these issues, we propose two key techniques: (1) MAP, a plug-and-play complex-domain propagation optimization strategy for digraph learning that integrates seamlessly into any MagDG to improve predictions while maintaining high running efficiency. (2) MAP++, a new digraph learning framework that further incorporates a learnable mechanism to achieve adaptive edge-wise propagation and node-wise aggregation in the complex domain for better performance. Extensive experiments on 12 datasets demonstrate that MAP offers flexibility, since it can be incorporated into any MagDG, and scalability, since it handles web-scale digraphs. MAP++ achieves SOTA predictive performance on 4 different downstream tasks.
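The $q$-parameterized magnetic Laplacian the abstract builds on has a standard construction: symmetrize the adjacency, encode edge direction as a complex phase of magnitude $2\pi q$, and form a Hermitian Laplacian. A minimal NumPy sketch (the toy adjacency matrix is an assumption for illustration):

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """q-parameterized magnetic Laplacian: L(q) = D_s - A_s * exp(i * Theta)."""
    A_s = 0.5 * (A + A.T)                    # symmetrized adjacency
    Theta = 2.0 * np.pi * q * (A - A.T)      # phase matrix encoding direction
    H = A_s * np.exp(1j * Theta)             # Hermitian "magnetic" adjacency
    D_s = np.diag(A_s.sum(axis=1))           # degrees of the symmetrized graph
    return D_s - H

# Toy 4-node directed cycle (hypothetical example input).
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)

L = magnetic_laplacian(A, q=0.25)
```

`L` is Hermitian and positive semi-definite, so its eigenvalues are real and non-negative, which is what lets MagDGs run spectral convolutions on digraphs; setting `q=0` recovers the ordinary Laplacian of the symmetrized (undirected) graph, discarding all direction information, hence the sensitivity to the choice of $q$ that the paper targets.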
Problem

Research questions and friction points this paper is trying to address.

Parameter Adjustment
Information Propagation
Directed Graph Neural Networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

MAP
MAP++
MagDG Optimization