A Graph Meta-Network for Learning on Kolmogorov-Arnold Networks

📅 2026-02-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing methods struggle to effectively model the weight space of Kolmogorov–Arnold Networks (KANs), limiting their applicability in tasks such as performance prediction. This work reveals for the first time that KANs share permutation symmetries with multilayer perceptrons (MLPs). Leveraging this insight, we introduce a KAN-graph representation to capture the computational structure of KANs and propose WS-KAN, the first symmetry-aware weight-space meta-network tailored for KANs, which accurately reconstructs KAN forward propagation. Evaluated on a multi-task KAN “model zoo” benchmark, WS-KAN substantially outperforms architecture-agnostic baselines, demonstrating its superior expressivity, effectiveness, and generalization capability.

📝 Abstract
Weight-space models learn directly from the parameters of neural networks, enabling tasks such as predicting their accuracy on new datasets. Naive methods, like applying MLPs to flattened parameters, perform poorly, making the design of better weight-space architectures a central challenge. While prior work leveraged permutation symmetries in standard networks to guide such designs, no analogous analysis or tailored architecture yet exists for Kolmogorov-Arnold Networks (KANs). In this work, we show that KANs share the same permutation symmetries as MLPs, and propose the KAN-graph, a graph representation of their computation. Building on this, we develop WS-KAN, the first weight-space architecture that learns on KANs, which naturally accounts for their symmetry. We analyze WS-KAN's expressive power, showing it can replicate an input KAN's forward pass, a standard approach for assessing expressiveness in weight-space architectures. We construct a comprehensive "zoo" of trained KANs spanning diverse tasks, which we use as benchmarks to empirically evaluate WS-KAN. Across all tasks, WS-KAN consistently outperforms structure-agnostic baselines, often by a substantial margin. Our code is available at https://github.com/BarSGuy/KAN-Graph-Metanetwork.
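The permutation symmetry the abstract refers to can be illustrated with a minimal sketch (not taken from the paper's codebase, and shown here for an MLP rather than a KAN): permuting a hidden layer's neurons, while applying the matching permutation to the rows of the incoming weights and the columns of the outgoing weights, leaves the network's input-output function unchanged. Symmetry-aware weight-space architectures like WS-KAN are designed to be invariant to exactly these relabelings.

```python
import numpy as np

# Hypothetical illustration: hidden-neuron permutation symmetry of an MLP,
# which the paper shows is shared by KANs.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0)

def mlp(x, W1, b1, W2, b2):
    # Two-layer MLP: x -> relu(x W1^T + b1) -> W2 (.) + b2
    return relu(x @ W1.T + b1) @ W2.T + b2

d_in, d_hid, d_out = 4, 8, 3
W1, b1 = rng.normal(size=(d_hid, d_in)), rng.normal(size=d_hid)
W2, b2 = rng.normal(size=(d_out, d_hid)), rng.normal(size=d_out)

# Relabel the hidden neurons: permute rows of (W1, b1) and columns of W2.
perm = rng.permutation(d_hid)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=(5, d_in))
# The permuted network computes the same function.
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```

Because many distinct parameter vectors encode the same function, architecture-agnostic baselines (e.g. an MLP over flattened weights) must learn this invariance from data, whereas a symmetry-aware meta-network gets it by construction.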
Problem

Research questions and friction points this paper is trying to address.

weight-space models
Kolmogorov-Arnold Networks
permutation symmetries
graph representation
neural network architecture
Innovation

Methods, ideas, or system contributions that make the work stand out.

Kolmogorov-Arnold Networks
weight-space learning
graph representation
permutation symmetry
meta-network