Analysis of Semi-Supervised Learning on Hypergraphs

📅 2025-10-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of rigorous theoretical foundations for hypergraph semi-supervised learning. We present the first systematic asymptotic consistency analysis of variational learning on random geometric hypergraphs, rigorously characterizing necessary and sufficient conditions for convergence to a weighted $p$-Laplacian equation. We propose Higher-Order Hypergraph Learning (HOHL), a novel framework that achieves multi-scale regularization via powers of the skeleton graph Laplacian operator; we prove that its objective functional $Gamma$-converges to a higher-order Sobolev seminorm. By unifying random geometric hypergraph modeling, spectral graph theory, and higher-order Laplacian regularization, HOHL significantly improves problem well-posedness and generalization. Extensive experiments on standard benchmark datasets validate our theoretical findings and establish new state-of-the-art performance.

Technology Category

Application Category

📝 Abstract
Hypergraphs provide a natural framework for modeling higher-order interactions, yet their theoretical underpinnings in semi-supervised learning remain limited. We provide an asymptotic consistency analysis of variational learning on random geometric hypergraphs, precisely characterizing the conditions ensuring the well-posedness of hypergraph learning as well as showing convergence to a weighted $p$-Laplacian equation. Motivated by this, we propose Higher-Order Hypergraph Learning (HOHL), which regularizes via powers of Laplacians from skeleton graphs for multiscale smoothness. HOHL converges to a higher-order Sobolev seminorm. Empirically, it performs strongly on standard baselines.
Problem

Research questions and friction points this paper is trying to address.

Analyzing semi-supervised learning consistency on hypergraphs
Characterizing conditions for well-posed hypergraph learning convergence
Proposing higher-order regularization for multiscale smoothness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Asymptotic consistency analysis on hypergraph learning
Higher-Order Hypergraph Learning with Laplacian regularization
Convergence to higher-order Sobolev seminorm formulation
🔎 Similar Papers
No similar papers found.