Polynomial Neural Sheaf Diffusion: A Spectral Filtering Approach on Cellular Sheaves

📅 2025-11-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Neural Sheaf Diffusion models rely on SVD-based normalization and dense per-edge restriction maps, leading to high computational overhead, gradient instability, and performance degradation as stalk dimension increases. To address these issues, we propose Polynomial Neural Sheaf Diffusion (PolyNSD): a spectral diffusion framework leveraging degree-K orthogonal polynomials of the normalized sheaf Laplacian to enable explicit K-hop propagation over the underlying graph. PolyNSD introduces trainable convex-combination spectral responses, spectral rescaling, and residual/gated pathways, while employing only diagonal restriction maps, thereby decoupling model performance from stalk dimension. This design mitigates heterophily and oversmoothing, improving training stability and inference efficiency. Evaluated on both homophilic and heterophilic graph benchmarks, PolyNSD achieves state-of-the-art accuracy with significantly reduced memory consumption and runtime, eliminating reliance on SVD decomposition and dense parameterization.

📝 Abstract
Sheaf Neural Networks equip graph structures with a cellular sheaf: a geometric structure which assigns local vector spaces (stalks) and learnable linear restriction/transport maps to nodes and edges, yielding an edge-aware inductive bias that handles heterophily and limits oversmoothing. However, common Neural Sheaf Diffusion implementations rely on SVD-based sheaf normalisation and dense per-edge restriction maps, which scale with stalk dimension, require frequent Laplacian rebuilds, and yield brittle gradients. To address these limitations, we introduce Polynomial Neural Sheaf Diffusion (PolyNSD), a new sheaf diffusion approach whose propagation operator is a degree-K polynomial in a normalised sheaf Laplacian, evaluated via a stable three-term recurrence on a spectrally rescaled operator. This provides an explicit K-hop receptive field in a single layer (independently of the stalk dimension), with a trainable spectral response obtained as a convex mixture of K+1 orthogonal polynomial basis responses. PolyNSD enforces stability via convex mixtures, spectral rescaling, and residual/gated paths, reaching new state-of-the-art results on both homophilic and heterophilic benchmarks. It inverts the Neural Sheaf Diffusion trend by achieving these results with just diagonal restriction maps, decoupling performance from large stalk dimension while reducing runtime and memory requirements.
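The propagation described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the Chebyshev basis, the softmax parameterisation of the convex mixture, and the function name `poly_sheaf_filter` are all assumptions on our part; the paper only states that a degree-K polynomial of a spectrally rescaled sheaf Laplacian is evaluated via a three-term recurrence with convex mixture weights.

```python
import numpy as np

def poly_sheaf_filter(L, X, theta, K):
    """Degree-K polynomial filter in a sheaf Laplacian L, applied to features X.

    L     : (nd, nd) normalised sheaf Laplacian, eigenvalues assumed in [0, 2]
    X     : (nd, f) stacked stalk features
    theta : (K+1,) unnormalised mixture logits (illustrative parameterisation)
    """
    # Spectral rescaling: shift eigenvalues from [0, 2] into [-1, 1],
    # the interval on which the Chebyshev recurrence is stable.
    L_hat = L - np.eye(L.shape[0])

    # Convex mixture: a softmax keeps the K+1 basis responses in a simplex,
    # so the learned spectral response is a convex combination.
    w = np.exp(theta - theta.max())
    w = w / w.sum()

    # Three-term Chebyshev recurrence: T0 = X, T1 = L_hat X,
    # T_k = 2 L_hat T_{k-1} - T_{k-2}; one evaluation gives a K-hop field.
    T_prev, T_curr = X, L_hat @ X
    out = w[0] * T_prev
    if K >= 1:
        out = out + w[1] * T_curr
    for k in range(2, K + 1):
        T_prev, T_curr = T_curr, 2 * (L_hat @ T_curr) - T_prev
        out = out + w[k] * T_curr
    return out
```

Because the recurrence only uses sparse matrix-vector products with the fixed operator, the cost per layer is K products, independent of the stalk dimension used to build L.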
Problem

Research questions and friction points this paper is trying to address.

Addresses scalability and stability issues in Neural Sheaf Diffusion
Introduces polynomial-based spectral filtering for K-hop receptive fields
Improves efficiency with diagonal restriction maps and reduced computational costs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Polynomial sheaf Laplacian for stable spectral filtering
Explicit K-hop receptive field with convex mixture responses
Diagonal restriction maps reducing runtime and memory usage
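To make the diagonal-restriction-map claim concrete, here is a minimal sketch of how a sheaf Laplacian is assembled when each restriction map is diagonal, so an edge endpoint costs d parameters instead of d*d. The function name, data layout, and dict-based parameter storage are our own illustrative choices, not the paper's code.

```python
import numpy as np

def sheaf_laplacian_diag(edges, n, d, F):
    """Assemble a sheaf Laplacian with diagonal restriction maps.

    edges : list of (u, v) node pairs
    n     : number of nodes, each carrying a d-dimensional stalk
    F     : dict mapping (edge_index, endpoint) -> (d,) vector of diagonal
            entries, i.e. d parameters per restriction map instead of d*d
    """
    L = np.zeros((n * d, n * d))
    for e, (u, v) in enumerate(edges):
        Fu = np.diag(F[(e, u)])  # restriction map: stalk of u -> stalk of edge e
        Fv = np.diag(F[(e, v)])
        # Standard block structure of a sheaf Laplacian:
        # diagonal blocks accumulate F^T F, off-diagonal blocks -F_u^T F_v.
        L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu
        L[v*d:(v+1)*d, v*d:(v+1)*d] += Fv.T @ Fv
        L[u*d:(u+1)*d, v*d:(v+1)*d] -= Fu.T @ Fv
        L[v*d:(v+1)*d, u*d:(u+1)*d] -= Fv.T @ Fu
    return L
```

With d = 1 and all diagonal entries set to 1 this reduces to the ordinary graph Laplacian, which is a useful sanity check when experimenting with sheaf constructions.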