Sinusoidal Approximation Theorem for Kolmogorov-Arnold Networks

📅 2025-07-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited expressive power of Kolmogorov–Arnold Networks (KANs), this paper proposes SinKAN—a novel KAN architecture employing learnable-frequency sinusoidal activation functions. Methodologically, SinKAN replaces both inner and outer spline activations in standard KANs with weighted sinusoidal basis functions, where frequencies are trainable parameters and phases are fixed to uniformly spaced constants; it further incorporates the Lorentz–Sprecher simplification framework to preserve theoretical soundness. The key contribution is the first systematic integration of learnable frequency parameters into KANs, substantially enhancing approximation capability for high-frequency and nonsmooth multivariate functions. Experiments demonstrate that SinKAN outperforms fixed-frequency Fourier networks across multiple benchmark function approximation tasks, achieves performance on par with multilayer perceptrons (MLPs), and retains KANs’ intrinsic advantages—namely, interpretability and parameter sparsity.
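The architecture described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: the class names, initializations, and phase range are our assumptions; only the overall design (weighted sinusoids with learnable frequencies, fixed linearly spaced phases, inner/outer double sum) comes from the summary.

```python
import numpy as np

class SinActivation:
    """One univariate activation: sum_k w_k * sin(f_k * x + p_k)."""
    def __init__(self, n_basis, rng):
        # Learnable parameters (random init here; trained by gradient
        # descent in practice).
        self.freq = rng.uniform(0.5, 2.0, size=n_basis)
        self.weight = rng.normal(0.0, 0.1, size=n_basis)
        # Phases fixed to linearly spaced constants (the [0, pi) range
        # is our illustrative choice).
        self.phase = np.linspace(0.0, np.pi, n_basis, endpoint=False)

    def __call__(self, x):
        # x: (batch,) -> (batch,)
        return np.sin(np.outer(x, self.freq) + self.phase) @ self.weight

def sinkan_forward(x, inner, outer):
    # Kolmogorov-Arnold double sum:
    # f(x) = sum_q Phi_q( sum_p phi_{q,p}(x_p) )
    return sum(
        outer_q(sum(inner_q[p](x[:, p]) for p in range(x.shape[1])))
        for inner_q, outer_q in zip(inner, outer)
    )

rng = np.random.default_rng(0)
n_in, n_basis = 2, 8
n_groups = 2 * n_in + 1  # 2n + 1 outer terms, as in the classical theorem
inner = [[SinActivation(n_basis, rng) for _ in range(n_in)]
         for _ in range(n_groups)]
outer = [SinActivation(n_basis, rng) for _ in range(n_groups)]

x = rng.uniform(-1.0, 1.0, size=(4, n_in))
y = sinkan_forward(x, inner, outer)  # one output per batch row
```

Note that both the inner and outer functions use the same sinusoidal parameterization, mirroring the paper's replacement of spline activations on every edge.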

📝 Abstract
The Kolmogorov-Arnold representation theorem states that any continuous multivariable function can be exactly represented as a finite superposition of continuous single-variable functions. Subsequent simplifications of this representation involve expressing these functions as parameterized sums of a smaller number of unique monotonic functions. These developments led to the proof of the universal approximation capabilities of multilayer perceptron networks with sigmoidal activations, an alternative theoretical direction that underlies most modern neural networks. Kolmogorov-Arnold Networks (KANs) have recently been proposed as an alternative to multilayer perceptrons. KANs feature learnable nonlinear activations applied directly to input values, modeled as weighted sums of basis spline functions. This approach replaces the linear transformations and sigmoidal post-activations used in traditional perceptrons. Subsequent works have explored alternatives to spline-based activations. In this work, we propose a novel KAN variant by replacing both the inner and outer functions in the Kolmogorov-Arnold representation with weighted sinusoidal functions of learnable frequencies. Inspired by simplifications introduced by Lorentz and Sprecher, we fix the phases of the sinusoidal activations to linearly spaced constant values and provide a proof of the construction's theoretical validity. We also conduct numerical experiments to evaluate its performance on a range of multivariable functions, comparing it with fixed-frequency Fourier transform methods and multilayer perceptrons (MLPs). We show that it outperforms the fixed-frequency Fourier transform and achieves performance comparable to MLPs.
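The representation theorem and the sinusoidal substitution described in the abstract can be written compactly as follows; the notation and the exact phase spacing are ours, not necessarily the paper's:

```latex
% Kolmogorov-Arnold representation of a continuous f on [0,1]^n
f(x_1,\dots,x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \varphi_{q,p}(x_p) \right)

% SinKAN replaces each univariate function (inner and outer) by a weighted
% sinusoidal sum with learnable frequencies \omega_k and fixed, linearly
% spaced phases c_k (spacing shown here is an illustrative assumption):
\varphi(x) = \sum_{k=1}^{K} w_k \,\sin(\omega_k x + c_k),
\qquad c_k = \frac{(k-1)\pi}{K}
```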
Problem

Research questions and friction points this paper is trying to address.

Spline-based KANs have limited expressive power, particularly for high-frequency and nonsmooth multivariable functions
Can spline activations be replaced with learnable-frequency sinusoidal functions while preserving the theoretical guarantees of the Kolmogorov-Arnold representation?
How does such a variant compare with fixed-frequency Fourier transform methods and multilayer perceptrons (MLPs)?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Replaces spline activations with sinusoidal functions
Uses learnable frequencies for sinusoidal activations
Fixes phases to linearly spaced constant values