🤖 AI Summary
This paper addresses the challenge of modeling higher-order interactions, beyond pairwise ones, in relational learning. While hypergraphs naturally represent such interactions, comparisons between hypergraph architectures and standard graph-level models have lacked systematic analysis. To bridge this gap, the authors propose lightweight hypergraph-level encodings grounded in classical hypergraph characteristics (e.g., spectral features) and prove that incorporating these encodings into standard message-passing GNNs provably increases their expressive power. Empirically, graph-level GNNs applied to hypergraph expansions often outperform dedicated hypergraph neural networks, even on inputs that are naturally parametrized as hypergraphs, and the proposed encodings yield substantial further gains when combined with graph-level models. The core contribution is establishing that *encoding higher-order information* into conventional GNNs is a more effective and principled route than designing specialized hypergraph architectures, supported by both theoretical guarantees and empirical evaluation.
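To make the encoding idea concrete, the sketch below computes a spectral node encoding from a hypergraph incidence matrix using the classical normalized hypergraph Laplacian of Zhou et al. (2006), L = I - D_v^{-1/2} H D_e^{-1} H^T D_v^{-1/2} (unit hyperedge weights). This is an illustrative assumption about what a "classical hypergraph characteristic" could look like, not the paper's exact construction; the function name is hypothetical.

```python
# Minimal sketch of a hypergraph spectral encoding (assumed construction,
# not the paper's exact one): low-frequency eigenvectors of the normalized
# hypergraph Laplacian of Zhou et al. (2006), with unit hyperedge weights.
import numpy as np

def hypergraph_spectral_encoding(H: np.ndarray, k: int = 4) -> np.ndarray:
    """Return a k-dimensional spectral encoding per node.

    H: (num_nodes, num_edges) binary incidence matrix.
    k: number of low-frequency eigenvectors to keep.
    """
    Dv = H.sum(axis=1)                      # node degrees
    De = H.sum(axis=0)                      # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(Dv, 1)))
    De_inv = np.diag(1.0 / np.maximum(De, 1))
    # Normalized hypergraph Laplacian: L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}
    L = np.eye(H.shape[0]) - Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L)    # eigenvalues in ascending order
    return eigvecs[:, :k]                   # low-frequency modes as features

# Example: 5 nodes, 2 hyperedges {0,1,2} and {2,3,4}.
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1], [0, 1]], dtype=float)
enc = hypergraph_spectral_encoding(H, k=2)
print(enc.shape)  # (5, 2) -- would be concatenated with node features
```

In this setup, the encoding is simply concatenated to each node's input features before running a standard message-passing GNN, which is what makes the approach lightweight relative to a dedicated hypergraph architecture.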
📝 Abstract
Higher-order information is crucial for relational learning in many domains where relationships extend beyond pairwise interactions. Hypergraphs provide a natural framework for modeling such relationships, which has motivated recent extensions of graph neural network architectures to hypergraphs. However, comparisons between hypergraph architectures and standard graph-level models remain limited. In this work, we systematically evaluate a selection of hypergraph-level and graph-level architectures to determine their effectiveness in leveraging higher-order information in relational learning. Our results show that graph-level architectures applied to hypergraph expansions often outperform hypergraph-level ones, even on inputs that are naturally parametrized as hypergraphs. As an alternative approach to leveraging higher-order information, we propose hypergraph-level encodings based on classical hypergraph characteristics. While these encodings do not significantly improve hypergraph architectures, they yield substantial performance gains when combined with graph-level models. Our theoretical analysis shows that hypergraph-level encodings provably increase the representational power of message-passing graph neural networks beyond that of their graph-level counterparts.
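The "hypergraph expansion" step the abstract refers to can be illustrated with the standard clique expansion, one common way of turning a hypergraph into an ordinary graph that any graph-level architecture can consume. The sketch below is a minimal, assumed implementation; the abstract does not specify which expansion the paper uses.

```python
# Minimal sketch of clique expansion (an assumed, standard choice): each
# hyperedge is replaced by a clique over its nodes, yielding a plain graph.
from itertools import combinations

def clique_expansion(hyperedges: list[set[int]]) -> set[tuple[int, int]]:
    """Expand hyperedges into undirected, deduplicated pairwise edges."""
    edges = set()
    for he in hyperedges:
        for u, v in combinations(sorted(he), 2):
            edges.add((u, v))
    return edges

# Hyperedges {0,1,2} and {2,3,4} become two triangles sharing node 2.
print(sorted(clique_expansion([{0, 1, 2}, {2, 3, 4}])))
# [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4)]
```

Note that the expansion alone discards which clique came from which hyperedge; the proposed hypergraph-level encodings are precisely the mechanism for reinjecting that higher-order information into the graph-level model.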