Variational Kolmogorov-Arnold Network

📅 2025-07-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
In Kolmogorov–Arnold networks (KANs), the number of basis functions for each univariate spline must be specified manually, limiting representational capacity and flexibility. Method: InfinityKAN integrates variational inference into the KAN architecture, treating the number of basis functions as a stochastic hyperparameter learned end-to-end by backpropagation, so a potentially infinite number of bases can be adapted during training. Grounded in the Kolmogorov–Arnold representation theorem, this yields a theoretically sound framework for function approximation with an unbounded basis. Contribution/Results: Experiments show that InfinityKAN outperforms standard KANs and MLPs on function approximation tasks, achieving faster convergence, higher accuracy, and better adaptability and generalization, without manual basis selection or architectural reconfiguration.
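To make the idea concrete, here is a minimal sketch of one KAN edge whose effective number of basis functions is softly truncated by learned gates, so the basis count can grow or shrink during training instead of being fixed in advance. The function name, the Fourier-style basis, and the cumulative-sigmoid gating are illustrative assumptions, not the paper's actual construction:

```python
import numpy as np

def adaptive_basis_edge(x, coeffs, keep_logits):
    """One KAN edge: a univariate function built from k_max candidate
    bases, softly truncated by learned gates (illustrative sketch)."""
    k_max = len(coeffs)
    freqs = np.arange(1, k_max + 1)
    phi = np.sin(np.outer(x, freqs))  # (batch, k_max) basis matrix
    # gate_k = prod_{j<=k} sigmoid(logit_j): higher-index bases are
    # progressively suppressed, giving an adaptive effective basis count.
    gates = np.cumprod(1.0 / (1.0 + np.exp(-keep_logits)))
    return phi @ (gates * coeffs)

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 5)
y = adaptive_basis_edge(x, coeffs=rng.normal(size=8),
                        keep_logits=np.zeros(8))
print(y.shape)  # → (5,)
```

Because the gates are differentiable, the truncation point itself receives gradients, which is what lets backpropagation decide how many bases each univariate function needs.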

📝 Abstract
Kolmogorov-Arnold Networks (KANs) are an emerging architecture for building machine learning models. KANs are based on the theoretical foundation of the Kolmogorov-Arnold Theorem and its expansions, which provide an exact representation of a multivariate continuous bounded function as the composition of a limited number of univariate continuous functions. While such theoretical results are powerful, their use as a representation learning alternative to a multi-layer perceptron (MLP) hinges on the ad-hoc choice of the number of bases modeling each of the univariate functions. In this work, we show how to address this problem by adaptively learning a potentially infinite number of bases for each univariate function during training. To this end, we cast the problem as a variational inference optimization problem. Our proposal, called InfinityKAN, learns this via backpropagation and extends the potential applicability of KANs by treating an important hyperparameter as part of the learning process.
Problem

Research questions and friction points this paper is trying to address.

Adaptively learning infinite bases for univariate functions
Replacing ad-hoc basis choices in Kolmogorov-Arnold Networks
Treating hyperparameters as part of variational learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptively learns infinite bases during training
Uses variational inference optimization approach
Treats hyperparameter as part of learning process
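The bullets above can be tied together with a sketch of the variational objective: a posterior q(K) over candidate basis counts is optimized to balance expected fit against a KL penalty toward a prior that favors fewer bases. Treating q(K) as a truncated categorical, and the function and variable names, are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def elbo_over_basis_count(losses_per_k, q_logits, prior_probs):
    """ELBO sketch: q(K) is a categorical posterior over candidate
    basis counts; the objective trades E_q[fit loss] against
    KL(q(K) || p(K)). Illustrative only."""
    q = np.exp(q_logits - q_logits.max())
    q /= q.sum()                                    # softmax posterior q(K)
    expected_loss = float(np.dot(q, losses_per_k))  # E_q[loss | K]
    kl = float(np.sum(q * np.log(q / prior_probs))) # KL(q(K) || p(K))
    return -(expected_loss + kl)                    # ELBO to maximize

losses = np.array([0.9, 0.4, 0.2, 0.25])  # fit loss for K = 1..4 bases
prior = np.array([0.4, 0.3, 0.2, 0.1])    # prior favoring fewer bases
val = elbo_over_basis_count(losses, np.zeros(4), prior)
print(round(val, 3))  # → -0.559
```

Maximizing this objective by gradient ascent on the logits is what lets the "hyperparameter" (the basis count) be learned jointly with the model weights.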