Parameter-Free Hypergraph Neural Network for Few-Shot Node Classification

πŸ“… 2025-10-24
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the limited higher-order structural modeling and poor generalization in few-shot hypergraph node classification, this paper proposes ZENβ€”the first parameter-free, fully linear hypergraph neural network. Methodologically, ZEN (1) computes the weight matrix in closed form, eliminating iterative training entirely, and (2) introduces a redundancy-aware propagation mechanism that filters spurious higher-order information while preserving linear time complexity. The model is fully interpretable, inference-efficient, and inherently immune to overfitting. Evaluated on 11 real-world hypergraph datasets, ZEN consistently outperforms eight strong baselines, achieving the highest average classification accuracy and up to 696Γ— faster single-inference speed. These results significantly advance the practicality and scalability of few-shot hypergraph learning.

Technology Category

Application Category

πŸ“ Abstract
Few-shot node classification on hypergraphs requires models that generalize from scarce labels while capturing high-order structures. Existing hypergraph neural networks (HNNs) effectively encode such structures but often suffer from overfitting and scalability issues due to complex, black-box architectures. In this work, we propose ZEN (Zero-Parameter Hypergraph Neural Network), a fully linear and parameter-free model that achieves both expressiveness and efficiency. Built upon a unified formulation of linearized HNNs, ZEN introduces a tractable closed-form solution for the weight matrix and a redundancy-aware propagation scheme to avoid iterative training and to eliminate redundant self information. On 11 real-world hypergraph benchmarks, ZEN consistently outperforms eight baseline models in classification accuracy while achieving up to 696x speedups over the fastest competitor. Moreover, the decision process of ZEN is fully interpretable, providing insights into the characteristic of a dataset. Our code and datasets are fully available at https://github.com/chaewoonbae/ZEN.
Problem

Research questions and friction points this paper is trying to address.

Addresses few-shot node classification on hypergraphs with scarce labels
Solves overfitting and scalability in hypergraph neural networks
Provides interpretable decision processes for hypergraph datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-free hypergraph neural network for few-shot learning
Closed-form weight matrix solution eliminates iterative training
Interpretable decision process with redundancy-aware propagation scheme
πŸ”Ž Similar Papers
C
Chaewoon Bae
School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST)
D
Doyun Choi
School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST)
Jaehyun Lee
Jaehyun Lee
Korea Institute of Fusion Energy (KFE)
Plasma PhysicsNuclear Fusion
Jaemin Yoo
Jaemin Yoo
Assistant Professor, KAIST
Data MiningMachine LearningGraph Neural NetworksTime Series Analysis