Complex-Weighted Convolutional Networks: Provable Expressiveness via Complex Diffusion

📅 2025-11-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address two persistent limitations of Graph Neural Networks (GNNs), oversmoothing and poor performance on heterophilic graphs, this paper proposes the Complex-Weighted Convolutional Network (CWCN). Methodologically, it generalizes random walks to the complex domain by assigning each edge a complex weight, yielding a learnable complex diffusion matrix; this diffusion is combined with learnable feature transforms and complex nonlinear activations, and requires no hyperparameters beyond those of standard GNNs. Theoretically, the authors prove that complex diffusion is highly expressive: with appropriately chosen complex weights, any node-classification task can be solved in the steady state of a complex random walk, overcoming a fundamental bottleneck of real-valued GNNs. Empirically, CWCN attains state-of-the-art or competitive performance on several heterophilic benchmark datasets, demonstrating that complex-domain diffusion is both theoretically grounded and practically effective.

📝 Abstract
Graph Neural Networks (GNNs) have achieved remarkable success across diverse applications, yet they remain limited by oversmoothing and poor performance on heterophilic graphs. To address these challenges, we introduce a novel framework that equips graphs with a complex-weighted structure, assigning each edge a complex number to drive a diffusion process that extends random walks into the complex domain. We prove that this diffusion is highly expressive: with appropriately chosen complex weights, any node-classification task can be solved in the steady state of a complex random walk. Building on this insight, we propose the Complex-Weighted Convolutional Network (CWCN), which learns suitable complex-weighted structures directly from data while enriching diffusion with learnable matrices and nonlinear activations. CWCN is simple to implement, requires no additional hyperparameters beyond those of standard GNNs, and achieves competitive performance on benchmark datasets. Our results demonstrate that complex-weighted diffusion provides a principled and general mechanism for enhancing GNN expressiveness, opening new avenues for models that are both theoretically grounded and practically effective.
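The abstract's core mechanism, a diffusion process driven by complex edge weights, can be sketched on a toy graph. This is an illustrative reconstruction, not the paper's code: the random phases, the magnitude-based row normalization, and the phase-based labeling rule are all assumptions standing in for the learned complex-weighted structure.

```python
import numpy as np

# Toy sketch of complex-weighted graph diffusion (illustrative only).
# Each edge carries a complex weight; repeated multiplication by the
# normalized complex adjacency drives a complex random walk.

rng = np.random.default_rng(0)
n = 6
A = (rng.random((n, n)) < 0.4).astype(float)   # random edge pattern
np.fill_diagonal(A, 1.0)                        # self-loops
phases = rng.uniform(0, 2 * np.pi, (n, n))      # hypothetical learned phases
W = A * np.exp(1j * phases)                     # complex-weighted adjacency
W = W / np.abs(W).sum(axis=1, keepdims=True)    # row-normalize by magnitude

x = np.ones(n, dtype=complex)                   # initial node signal
for _ in range(50):                             # run the walk for a fixed horizon
    x = W @ x

# In the paper's result, node classes are recoverable from the steady state;
# here we illustrate one hypothetical readout: binning the phase of x.
labels = np.digitize(np.angle(x), bins=[-np.pi / 2, 0, np.pi / 2])
print(labels.shape)  # prints (6,)
```

The point of the sketch is that, unlike a real-valued random walk (which mixes toward a single stationary distribution and hence oversmooths), the complex phases let distinct nodes retain distinct steady-state signatures.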
Problem

Research questions and friction points this paper is trying to address.

Overcoming oversmoothing and heterophilic graph limitations in GNNs
Enhancing expressiveness through complex-weighted diffusion processes
Learning optimal complex graph structures for node classification tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Complex-weighted edges drive diffusion process
Learns complex-weighted structures directly from data
Enriches diffusion with learnable nonlinear activations
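The three Innovation bullets can be combined into a minimal single-layer sketch: a complex-weighted adjacency, a learnable complex feature transform, and a complex nonlinearity. The layer name, shapes, and the modReLU-style activation are my assumptions, not the paper's implementation.

```python
import numpy as np

def mod_relu(z, b=-0.1):
    # Complex nonlinearity: threshold the magnitude, preserve the phase.
    mag = np.abs(z)
    return np.maximum(mag + b, 0.0) * np.exp(1j * np.angle(z))

def cwcn_layer(W, X, Theta):
    # W: (n, n) complex-weighted adjacency; X: (n, d) complex node features;
    # Theta: (d, d_out) learnable complex feature transform.
    return mod_relu(W @ X @ Theta)

rng = np.random.default_rng(1)
n, d, d_out = 5, 4, 3
W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
X = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))
Theta = rng.standard_normal((d, d_out)) + 1j * rng.standard_normal((d, d_out))

H = cwcn_layer(W, X, Theta)
print(H.shape)  # prints (5, 3)
```

In training, W's phases and Theta would be learned from data, which is how the model "learns complex-weighted structures directly" rather than fixing them by hand.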