🤖 AI Summary
This work addresses a limitation of conventional neural networks, which model only pairwise interactions and thus fail to capture higher-order couplings among neurons. While existing approaches to higher-order modeling often rely on explicit hypergraph structures, which restricts their generality, this paper proposes a novel method for incorporating higher-order interactions into general feedforward architectures. The key innovation is to introduce spectral-domain parameterization into feedforward networks for the first time, enabling higher-order connectivity to be encoded and propagated through a spectral reconstruction of the network parameters. Notably, the method operates without requiring input hypergraph structures and integrates tensor-based interaction modeling with stability-enhancing optimization techniques. As a result, it substantially enhances both representational capacity and training stability, yielding a scalable, general-purpose, and robust framework for higher-order feedforward neural networks.
📝 Abstract
Neural networks are fundamental tools of modern machine learning. The standard paradigm assumes binary interactions (across feedforward linear passes) between interconnected units, organized in sequential layers. Generalized architectures have also been designed that move beyond pairwise interactions, so as to account for higher-order couplings among computing neurons. Higher-order networks are, however, usually deployed as augmented graph neural networks (GNNs) and, as such, prove advantageous only in contexts where the input exhibits an explicit hypergraph structure. Here, we present Spectral Higher-Order Neural Networks (SHONNs), a new algorithmic strategy to incorporate higher-order interactions into general-purpose, feedforward network structures. SHONNs leverage a reformulation of the model in terms of spectral attributes. This makes it possible to mitigate the stability and parameter-scaling problems that commonly accompany weighted, higher-order forward propagations.
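To make the "reformulation in terms of spectral attributes" more concrete, here is a minimal numpy sketch of one way a layer's weights can be parameterized in the spectral domain. All names (`spectral_layer`, `Phi`, `lam`) and the specific block construction are hypothetical assumptions for illustration; the paper's actual parameterization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_layer(n_in, n_out, rng):
    """Hypothetical spectral parameterization of one feedforward layer.

    Instead of training the (n_out x n_in) weight matrix W directly, the
    layer is described by the spectral attributes of a square
    (n_in + n_out) adjacency-like matrix: an eigenvector basis Phi and a
    vector of eigenvalues lam (both would be trainable in practice).
    W is then reconstructed as the off-diagonal block of
    Phi @ diag(lam) @ inv(Phi) that maps inputs to outputs.
    """
    n = n_in + n_out
    # Eigenvector basis kept close to the identity for numerical stability.
    Phi = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    lam = rng.standard_normal(n)                 # spectral (eigenvalue) parameters
    A = Phi @ np.diag(lam) @ np.linalg.inv(Phi)  # spectral reconstruction
    W = A[n_in:, :n_in]                          # block mapping inputs -> outputs
    return W, Phi, lam

# Forward pass uses the reconstructed W like an ordinary linear layer.
W, Phi, lam = spectral_layer(4, 3, rng)
x = rng.standard_normal(4)
y = W @ x
print(W.shape, y.shape)
```

Training then updates the spectral attributes (`lam`, and optionally `Phi`) rather than the weight entries themselves, which is the sense in which the model is "reformulated in terms of spectral attributes"; how SHONNs extend this picture to higher-order (tensorial) couplings is the subject of the paper.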