Higher-Order Regularization Learning on Hypergraphs

📅 2025-10-30
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Conventional hypergraph regularization suffers from insufficient higher-order smoothness. Method: This paper proposes Truncated Higher-Order Hypergraph Learning (HOHL), which constructs a higher-order smoothness regularizer based on powers of multi-scale hypergraph Laplacian operators. Contribution/Results: Theoretically, we establish the first asymptotic consistency guarantee for truncated HOHL and derive an explicit convergence rate under full supervision. Methodologically, we extend HOHL to active learning and non-geometric structured data, significantly enhancing model generality and robustness. Empirical results demonstrate that HOHL consistently outperforms baselines across diverse tasks—particularly excelling on data lacking intrinsic geometric structure, where it exhibits superior generalization. This work systematically establishes the theoretical foundations and practical applicability boundaries of HOHL, advancing higher-order hypergraph learning beyond geometric settings toward broader learning paradigms.

Technology Category

Application Category

📝 Abstract
Higher-Order Hypergraph Learning (HOHL) was recently introduced as a principled alternative to classical hypergraph regularization, enforcing higher-order smoothness via powers of multiscale Laplacians induced by the hypergraph structure. Prior work established the well- and ill-posedness of HOHL through an asymptotic consistency analysis in geometric settings. We extend this theoretical foundation by proving the consistency of a truncated version of HOHL and deriving explicit convergence rates when HOHL is used as a regularizer in fully supervised learning. We further demonstrate its strong empirical performance in active learning and in datasets lacking an underlying geometric structure, highlighting HOHL's versatility and robustness across diverse learning settings.
Problem

Research questions and friction points this paper is trying to address.

Extending theoretical consistency analysis for truncated HOHL
Deriving explicit convergence rates in supervised learning
Demonstrating empirical performance in non-geometric datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Enforces higher-order smoothness via hypergraph Laplacians
Uses truncated HOHL with proven convergence rates
Applies regularization in supervised and active learning
🔎 Similar Papers
No similar papers found.