"KAN you hear me?" Exploring Kolmogorov-Arnold Networks for Spoken Language Understanding

📅 2025-05-26
🤖 AI Summary
Kolmogorov–Arnold Networks (KANs) remain underexplored in speech processing, particularly for end-to-end spoken language understanding (SLU). Method: We systematically integrate KAN layers, in place of conventional linear layers, into both CNN- and Transformer-based SLU architectures, evaluate five KAN placement configurations within the dense block, and introduce waveform-level attention analysis to uncover how KANs model salient time-frequency regions in raw speech. Contribution/Results: Evaluated across five SLU benchmarks of increasing complexity, KAN-enhanced models match or exceed baseline performance while demonstrating superior generalization and interpretability. This work establishes KANs as a principled, effective replacement for linear layers in speech modeling, offering an interpretable, non-linear parameterization for SLU.

📝 Abstract
Kolmogorov-Arnold Networks (KANs) have recently emerged as a promising alternative to traditional neural architectures, yet their application to speech processing remains underexplored. This work presents the first investigation of KANs for Spoken Language Understanding (SLU) tasks. We experiment with 2D-CNN models on two datasets, integrating KAN layers in five different configurations within the dense block. The best-performing setup, which places a KAN layer between two linear layers, is then applied directly to transformer-based models and evaluated on five SLU datasets of increasing complexity. Our results show that KAN layers can effectively replace linear layers, achieving comparable or superior performance in most cases. Finally, we provide insights into how KAN and linear layers on top of transformers attend differently to input regions of the raw waveforms.
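To make the idea concrete: where a linear layer multiplies each input by a scalar weight, a KAN layer places a learnable univariate function on each input-output edge and sums their outputs. The sketch below is a toy NumPy illustration of this principle, not the paper's code; the `KANLayer` and `dense_block` names are invented here, and Gaussian radial basis functions stand in for the B-spline parameterization typically used in KAN implementations.

```python
import numpy as np

class KANLayer:
    """Toy KAN-style layer: each edge (input i -> output j) applies its own
    learnable univariate function phi_{j,i}; the output is the sum over edges.
    Here phi is parameterized with Gaussian radial basis functions (actual
    KAN implementations typically use B-splines instead)."""

    def __init__(self, in_dim, out_dim, n_basis=8, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.linspace(-1.0, 1.0, n_basis)  # shared basis grid
        self.width = self.centers[1] - self.centers[0]  # basis bandwidth
        # one coefficient vector per edge: shape (out_dim, in_dim, n_basis)
        self.coef = rng.normal(0.0, 0.1, (out_dim, in_dim, n_basis))

    def __call__(self, x):
        # x: (batch, in_dim) -> RBF features: (batch, in_dim, n_basis)
        feats = np.exp(-(((x[..., None] - self.centers) / self.width) ** 2))
        # y[b, j] = sum_i phi_{j,i}(x[b, i]) = sum over inputs and basis terms
        return np.einsum("bif,oif->bo", feats, self.coef)


def dense_block(x, w1, kan, w2):
    """Hypothetical sketch of the best-performing configuration from the
    abstract: a KAN layer sandwiched between two linear maps."""
    return kan(x @ w1) @ w2
```

In this sketch the number of parameters per edge grows with `n_basis`, which is the price paid for replacing a scalar weight with a full univariate function; the abstract's finding is that this trade-off matches or beats plain linear layers on SLU tasks.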
Problem

Research questions and friction points this paper is trying to address.

Exploring KANs for Spoken Language Understanding tasks
Comparing KAN layers with linear layers in SLU models
Analyzing KAN layer attention patterns in transformer models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrating KAN layers in CNN models
Applying KAN layers in transformer models
Replacing linear layers with KAN layers