GIST: Gauge-Invariant Spectral Transformers for Scalable Graph Neural Operators

πŸ“… 2026-03-17
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work addresses the failure of graph neural operators to generalize across meshes, which stems from the high computational complexity of spectral methods and their violation of gauge invariance. To overcome these limitations, the authors propose a novel graph Transformer architecture that integrates random projection with inner-product attention, rigorously preserving gauge invariance in the embedding space. The method achieves linear computational complexity O(N)β€”the first of its kindβ€”with theoretical guarantees for discretization-invariant learning and bounded error. The model enables seamless parameter transfer across arbitrary meshes, attaining a micro F1-score of 99.50% on the PPI dataset and setting a new state-of-the-art performance on the DrivAerNet aerodynamic prediction task involving 750,000 nodes.

Technology Category

Application Category

πŸ“ Abstract
Adapting transformer positional encoding to meshes and graph-structured data presents significant computational challenges: exact spectral methods require cubic-complexity eigendecomposition and can inadvertently break gauge invariance through numerical solver artifacts, while efficient approximate methods sacrifice gauge symmetry by design. Both failure modes cause catastrophic generalization in inductive learning, where models trained with one set of numerical choices fail when encountering different spectral decompositions of similar graphs or discretizations of the same mesh. We propose GIST (Gauge-Invariant Spectral Transformers), a new graph transformer architecture that resolves this challenge by achieving end-to-end $\mathcal{O}(N)$ complexity through random projections while algorithmically preserving gauge invariance via inner-product-based attention on the projected embeddings. We prove GIST achieves discretization-invariant learning with bounded mismatch error, enabling parameter transfer across arbitrary mesh resolutions for neural operator applications. Empirically, GIST matches state-of-the-art on standard graph benchmarks (e.g., achieving 99.50% micro-F1 on PPI) while uniquely scaling to mesh-based Neural Operator benchmarks with up to 750K nodes, achieving state-of-the-art aerodynamic prediction on the challenging DrivAerNet and DrivAerNet++ datasets.
Problem

Research questions and friction points this paper is trying to address.

gauge invariance
spectral methods
graph neural operators
mesh discretization
inductive generalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gauge Invariance
Spectral Transformers
Graph Neural Operators
Random Projections
Discretization-Invariant Learning
πŸ”Ž Similar Papers
No similar papers found.