Generalizing GNNs with Tokenized Mixture of Experts

📅 2026-02-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Static graph neural networks struggle to simultaneously achieve strong performance on clean data, generalization under distribution shifts, and stability to perturbations. This work proposes STEM-GNN, a pretrain-then-finetune framework that combines a mixture-of-experts encoder, a vector-quantized token interface, and a Lipschitz-regularized prediction head to enable adaptive computation with stable outputs. The key idea is to decouple instance-conditioned routing from output stability, using the tokenized interface to suppress routing fluctuations. A theoretical analysis reveals fundamental trade-offs between coverage and selection, and between base sensitivity and fluctuation amplification under perturbations. Evaluated across nine node-, link-, and graph-level benchmarks, STEM-GNN shows substantially improved robustness under diverse perturbations, including degree/homophily shifts and feature/edge corruptions, while remaining competitive on clean data.

📝 Abstract
Deployed graph neural networks (GNNs) are frozen at deployment yet must fit clean data, generalize under distribution shifts, and remain stable to perturbations. We show that static inference induces a fundamental trade-off: improving stability requires reducing reliance on shift-sensitive features, leaving an irreducible worst-case generalization floor. Instance-conditional routing can break this ceiling, but it is fragile: shifts can mislead routing, and perturbations can make routing fluctuate. We capture these effects via two decompositions, one separating coverage from selection and one separating base sensitivity from fluctuation amplification. Based on these insights, we propose STEM-GNN, a pretrain-then-finetune framework with a mixture-of-experts encoder for diverse computation paths, a vector-quantized token interface to stabilize encoder-to-head signals, and a Lipschitz-regularized head to bound output amplification. Across nine node-, link-, and graph-level benchmarks, STEM-GNN achieves a stronger three-way balance, improving robustness to degree/homophily shifts and to feature/edge corruptions while remaining competitive on clean graphs.
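The three components in the abstract compose into a simple pipeline: a gated mixture of experts produces an embedding, a vector-quantized codebook snaps that embedding to a discrete token (absorbing small routing fluctuations), and a spectrally normalized head bounds how much any remaining perturbation can be amplified. The following is a minimal NumPy sketch of that composition under stated assumptions; all function names, shapes, and the softmax gate are illustrative, not the paper's actual implementation, and graph message passing is omitted.

```python
import numpy as np

def moe_encoder(x, expert_ws, gate_w):
    """Instance-conditioned mixture of experts (illustrative): a softmax gate
    over expert logits blends per-expert linear projections of each input."""
    logits = x @ gate_w                                   # (n, num_experts)
    gates = np.exp(logits - logits.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)
    outs = np.stack([x @ w for w in expert_ws], axis=1)   # (n, num_experts, d)
    return (gates[..., None] * outs).sum(axis=1)          # (n, d)

def vq_tokenize(h, codebook):
    """Vector-quantized token interface: replace each embedding with its
    nearest codebook entry, so small encoder fluctuations that stay within
    a token's cell do not change the signal passed to the head."""
    d2 = ((h[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d2.argmin(axis=1)
    return codebook[idx], idx

def lipschitz_head(z, w, target_lip=1.0):
    """Lipschitz-bounded linear head via spectral normalization: rescale the
    weights so the layer's largest singular value is at most target_lip,
    which bounds output amplification of any input perturbation."""
    sigma = np.linalg.svd(w, compute_uv=False)[0]
    return z @ (w * min(1.0, target_lip / sigma))
```

By construction the head satisfies ||f(a) - f(b)|| <= target_lip * ||a - b||, since rescaling caps the weight matrix's spectral norm; the quantization step contributes stability of a different kind, collapsing nearby encoder outputs onto the same token.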
Problem

Research questions and friction points this paper is trying to address.

Graph Neural Networks
Distribution Shift
Robustness
Stability
Generalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixture of Experts
Vector Quantization
Lipschitz Regularization
Graph Neural Networks
Distribution Shift Robustness