Fractional Heat Kernel for Semi-Supervised Graph Learning with Small Training Sample Size

📅 2025-10-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited effectiveness of label propagation and self-training in few-shot semi-supervised graph learning, this paper proposes a novel algorithm grounded in source-term-incorporated fractional-order heat kernel dynamics. Methodologically, it introduces the fractional power of the graph Laplacian into diffusion models, enabling global, non-local multi-hop label propagation via variational principles. An efficient and scalable computational framework for the fractional heat kernel is developed by integrating Chebyshev polynomial approximation with GCN and GAT architectures. The approach significantly enhances model expressivity and robustness under extremely low labeling rates (e.g., 1%–5%), outperforming conventional diffusion methods and state-of-the-art GNNs across multiple standard graph benchmarks. Key contributions include: (i) the first formulation of fractional-order diffusion dynamics with explicit source terms for semi-supervised graph learning; (ii) a theoretically principled, computationally tractable framework unifying spectral and spatial GNN designs; and (iii) empirical gains in accuracy and stability under severe label scarcity.

📝 Abstract
In this work, we introduce novel algorithms for label propagation and self-training based on fractional heat kernel dynamics with a source term. We motivate the methodology through the classical correspondence between information theory and the physics of parabolic evolution equations. We integrate the fractional heat kernel into Graph Neural Network architectures such as Graph Convolutional Networks and Graph Attention Networks, enhancing their expressiveness through adaptive, multi-hop diffusion. Chebyshev polynomial approximations make the method computationally feasible on large graphs. Motivating variational formulations show that extending the classical diffusion model to fractional powers of the Laplacian introduces nonlocal interactions that diffuse labels more globally. The balance between supervision at known labels and diffusion across the graph is particularly advantageous when only a small number of labeled training examples are available. We demonstrate the effectiveness of this approach on standard datasets.
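As a rough illustration of the diffusion mechanism described in the abstract, the sketch below propagates seed labels with a fractional heat kernel exp(-t L^α), computed here by a direct eigendecomposition of the normalized Laplacian. The function name, toy graph, and parameter values are illustrative assumptions, not the paper's implementation (which uses Chebyshev approximations and GNN integration).

```python
import numpy as np

def fractional_heat_propagate(A, Y, alpha=0.5, t=1.0):
    """Propagate labels with the fractional heat kernel exp(-t * L**alpha).

    A: (n, n) symmetric adjacency matrix
    Y: (n, c) one-hot labels (zero rows for unlabeled nodes)
    alpha: fractional power of the normalized Laplacian (alpha=1 recovers
           classical heat diffusion)
    t: diffusion time
    """
    d = np.maximum(A.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt  # normalized Laplacian
    # Spectral calculus: L = U diag(lam) U^T, so L^alpha = U diag(lam^alpha) U^T
    lam, U = np.linalg.eigh(L)
    lam = np.clip(lam, 0.0, None)  # guard tiny negative eigenvalues
    kernel = U @ np.diag(np.exp(-t * lam**alpha)) @ U.T
    return kernel @ Y  # diffused label scores

# Toy graph: two triangles joined by a single edge, one labeled seed per class
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
Y = np.zeros((6, 2))
Y[0, 0] = 1.0  # node 0 labeled class 0
Y[5, 1] = 1.0  # node 5 labeled class 1
scores = fractional_heat_propagate(A, Y, alpha=0.5, t=2.0)
pred = scores.argmax(axis=1)
```

With only two labeled nodes, the kernel spreads each seed's influence across the whole graph, and unlabeled nodes are assigned to the class whose seed reaches them with more diffused mass.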
Problem

Research questions and friction points this paper is trying to address.

Enhancing graph learning with limited labeled training data
Improving label propagation using fractional heat kernel dynamics
Enabling nonlocal interactions through fractional Laplacian diffusion
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses fractional heat kernel dynamics for graph learning
Integrates fractional kernel into Graph Neural Networks
Applies Chebyshev polynomial approximations for large graphs
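The scalability point above rests on replacing the eigendecomposition with polynomial filtering. As a minimal sketch of that idea (assumed details: spectrum of the normalized Laplacian in [0, 2], a discrete Chebyshev transform for the coefficients, and illustrative truncation orders), one can approximate the action of exp(-t L^α) on a feature matrix using only matrix-vector products:

```python
import numpy as np

def chebyshev_fractional_heat(L, X, alpha=0.5, t=1.0, K=64, m=256):
    """Approximate exp(-t * L**alpha) @ X with a truncated Chebyshev series.

    Avoids the O(n^3) eigendecomposition: only sparse-friendly
    matrix-vector products with L are required. Assumes the eigenvalues
    of L (e.g. a normalized graph Laplacian) lie in [0, 2].
    """
    n = L.shape[0]
    # Map the spectrum [0, 2] to [-1, 1] via s = lam - 1 and fit
    # f(s) = exp(-t * (s + 1)**alpha) with a discrete Chebyshev transform
    # at m Chebyshev-Gauss nodes.
    theta = (np.arange(m) + 0.5) * np.pi / m
    s = np.cos(theta)
    f = np.exp(-t * np.clip(s + 1.0, 0.0, None) ** alpha)
    c = np.array([(2.0 / m) * np.sum(f * np.cos(k * theta)) for k in range(K)])
    c[0] /= 2.0
    # Evaluate sum_k c_k T_k(L - I) @ X via the three-term recurrence.
    Ls = L - np.eye(n)
    T_prev, T_curr = X, Ls @ X
    out = c[0] * T_prev + c[1] * T_curr
    for k in range(2, K):
        T_prev, T_curr = T_curr, 2.0 * (Ls @ T_curr) - T_prev
        out += c[k] * T_curr
    return out

# Sanity-check setup: normalized Laplacian of an 8-node cycle (degree 2)
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.eye(n) - A / 2.0
X = np.eye(n)[:, :1]  # unit impulse at node 0
out = chebyshev_fractional_heat(L, X, alpha=0.5, t=1.0)
```

Because only products with `L` appear, the same recurrence runs unchanged on a sparse matrix, which is what makes the approach tractable on large graphs.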
Farid Bozorgnia
Vyacheslav Kungurtsev (Czech Technical University in Prague)
Shirali Kadyrov
Mohsen Yousefnezhad