🤖 AI Summary
Existing graph contrastive learning methods rely on handcrafted, fixed views (e.g., local/global), limiting their ability to adaptively capture multi-scale structural patterns. To address this, we propose an adaptive multi-view graph contrastive learning framework based on fractional-order neural diffusion networks. Our method employs a learnable fractional-order derivative α ∈ (0,1] to continuously modulate the scope of information propagation, enabling end-to-end generation of diverse node representations spanning local to global scales. By parameterizing diffusion scales as a continuous dynamical process, it eliminates dependence on discrete view design and manual data augmentation. Integrating fractional calculus, continuous-time graph neural networks, and contrastive learning, our approach significantly enhances representation discriminability and robustness. Extensive experiments demonstrate consistent and substantial improvements over state-of-the-art graph contrastive learning methods across standard benchmarks.
📝 Abstract
Graph contrastive learning (GCL) learns node and graph representations by contrasting multiple views of the same graph. Existing methods typically rely on fixed, handcrafted views (usually a local and a global perspective), which limits their ability to capture multi-scale structural patterns. We present an augmentation-free, multi-view GCL framework grounded in fractional-order continuous dynamics. By varying the fractional derivative order $\alpha \in (0,1]$, our encoders produce a continuous spectrum of views: small $\alpha$ yields localized features, while large $\alpha$ induces broader, global aggregation. We treat $\alpha$ as a learnable parameter so the model can adapt diffusion scales to the data and automatically discover informative views. This principled approach generates diverse, complementary representations without manual augmentations. Extensive experiments on standard benchmarks demonstrate that our method produces more robust and expressive embeddings and outperforms state-of-the-art GCL baselines.
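The locality claim behind the abstract — smaller fractional order $\alpha$ keeps information propagation more confined, larger $\alpha$ spreads it more globally — can be illustrated independently of the paper's architecture. The sketch below is a toy, not the authors' model: it solves the Caputo time-fractional heat equation $D_t^\alpha u = -Lu$ on a path graph spectrally, $u(t) = E_\alpha(-t^\alpha L)\,u_0$, using the closed forms $E_1(z) = e^z$ and $E_{1/2}(z) = e^{z^2}\,\mathrm{erfc}(-z)$ of the Mittag-Leffler function. The choice of graph, time horizon, and the restriction to $\alpha \in \{0.5, 1\}$ are all assumptions made for the sketch.

```python
import numpy as np
from math import erfc, exp

# Path graph with combinatorial Laplacian L = D - A (mass-conserving: L @ 1 = 0).
N = 41
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)  # eigenpairs of the symmetric Laplacian

def fractional_diffuse(x0, alpha, t):
    """Solve the Caputo time-fractional heat equation D_t^alpha u = -L u.

    Spectral solution: u(t) = E_alpha(-t^alpha * L) x0, with the
    Mittag-Leffler function E_alpha.  This sketch only covers the two
    orders that have simple closed forms:
        E_1(z)     = exp(z)                 (classical heat kernel)
        E_{1/2}(z) = exp(z^2) * erfc(-z)    (sub-diffusion)
    """
    z = -(t ** alpha) * lam
    if alpha == 1.0:
        filt = np.exp(z)
    elif alpha == 0.5:
        filt = np.array([exp(v * v) * erfc(-v) for v in z])
    else:
        raise NotImplementedError("sketch covers alpha in {0.5, 1.0} only")
    return U @ (filt * (U.T @ x0))

c = N // 2
x0 = np.zeros(N)
x0[c] = 1.0                                  # unit mass at the center node
pos2 = (np.arange(N) - c) ** 2

u_sub = fractional_diffuse(x0, alpha=0.5, t=2.0)  # sub-diffusive, more local
u_std = fractional_diffuse(x0, alpha=1.0, t=2.0)  # classical, more global

msd = lambda u: float(pos2 @ u)              # mean squared displacement
print(f"MSD  alpha=0.5: {msd(u_sub):.2f}   alpha=1.0: {msd(u_std):.2f}")
print(f"peak alpha=0.5: {u_sub[c]:.3f}  alpha=1.0: {u_std[c]:.3f}")
```

At the same time horizon, the $\alpha = 0.5$ view keeps more mass at the source node and has a smaller mean squared displacement than the $\alpha = 1$ view, matching the abstract's local-versus-global reading of $\alpha$; the proposed method goes further by learning $\alpha$ end-to-end inside a neural diffusion encoder rather than fixing it per view.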