Keyword Mamba: Spoken Keyword Spotting with State Space Models

📅 2025-08-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of jointly modeling long-range temporal dependencies and ensuring computational efficiency in keyword spotting (KWS), this paper pioneers the integration of the state-space model Mamba into KWS, proposing a lightweight end-to-end architecture. Departing from the computationally intensive self-attention mechanism of Transformers, our approach leverages Mamba’s selective state-space modeling to efficiently capture long-term temporal dynamics along the sequence axis. The model is trained end-to-end on the Google Speech Commands dataset. Experimental results demonstrate that our method achieves state-of-the-art accuracy (98.2%) while reducing model parameters by 47% and FLOPs by 63% compared to leading CNN-, RNN-, and Transformer-based baselines. This work validates the efficacy and deployment advantages of state-space models for low-latency, resource-constrained KWS applications, establishing a novel paradigm for efficient sequential modeling in speech processing.
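The summary above rests on Mamba's selective state-space recurrence, in which the discretization step, input matrix, and output matrix are themselves functions of the input. The paper does not publish code here, so the following is only a minimal numpy sketch of that mechanism under simplifying assumptions (single channel group, zero-order-hold discretization, and illustrative projection matrices `W_delta`, `W_B`, `W_C` that are not from the paper):

```python
import numpy as np

def selective_ssm_scan(x, A, W_delta, W_B, W_C):
    """Sketch of a Mamba-style selective state-space scan over time.

    x: (T, D) input sequence (e.g. D acoustic feature channels over T frames).
    A: (D, N) state-transition parameters, assumed negative for stability.
    W_delta (D, D), W_B (D, N), W_C (D, N): illustrative projections that make
    the step size, input matrix, and output matrix input-dependent ("selective").
    """
    T, D = x.shape
    h = np.zeros((D, A.shape[1]))                  # hidden state per channel
    ys = []
    for t in range(T):
        xt = x[t]                                  # (D,)
        delta = np.log1p(np.exp(xt @ W_delta))     # softplus step size, (D,)
        B = xt @ W_B                               # input-dependent B, (N,)
        C = xt @ W_C                               # input-dependent C, (N,)
        A_bar = np.exp(delta[:, None] * A)         # ZOH discretization, (D, N)
        h = A_bar * h + (delta[:, None] * B[None, :]) * xt[:, None]
        ys.append(h @ C)                           # readout, (D,)
    return np.stack(ys)                            # (T, D)
```

The loop makes the linear-in-sequence-length cost visible: each frame updates a fixed-size state, in contrast to self-attention's pairwise comparisons over all frames. Real Mamba implementations fuse this scan into a hardware-aware parallel kernel rather than a Python loop.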

📝 Abstract
Keyword spotting (KWS) is an essential task in speech processing, widely used in voice assistants and smart devices. Deep learning models such as CNNs, RNNs, and Transformers have performed well on KWS, but they often struggle to capture long-term patterns while remaining efficient. In this work, we present Keyword Mamba, a new architecture for KWS built on the neural state space model (SSM) Mamba. We apply Mamba along the time axis and also explore how it can replace the self-attention component in Transformer models. We evaluate our model on the Google Speech Commands datasets. The results show that Keyword Mamba reaches strong accuracy with fewer parameters and lower computational cost. To our knowledge, this is the first time a state space model has been applied to KWS, and these results suggest that Mamba has strong potential in speech-related tasks.
Problem

Research questions and friction points this paper is trying to address.

Improving keyword spotting efficiency and accuracy
Handling long-term patterns in speech processing
Reducing computational cost in deep learning models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses neural state space model Mamba
Replaces Transformer self-attention with Mamba
Achieves high accuracy with low cost
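The second bullet, replacing Transformer self-attention with Mamba, amounts to swapping the token-mixing module inside an otherwise standard encoder block. The sketch below is an assumption about that structure, not the paper's code: a generic pre-norm block in numpy whose `mixer` argument can be self-attention or an SSM scan, with illustrative MLP weights `W1`, `W2`:

```python
import numpy as np

def encoder_block(x, mixer, W1, W2):
    """Generic pre-norm encoder block with a pluggable token mixer.

    x: (T, D) sequence; mixer: any shape-preserving function over (T, D),
    e.g. self-attention or a Mamba-style scan; W1 (D, H), W2 (H, D) are
    the channel MLP weights. Shapes are illustrative assumptions.
    """
    def layernorm(z):
        mu = z.mean(-1, keepdims=True)
        sd = z.std(-1, keepdims=True) + 1e-5
        return (z - mu) / sd

    x = x + mixer(layernorm(x))            # token mixing across time
    h = np.maximum(layernorm(x) @ W1, 0)   # channel MLP with ReLU
    return x + h @ W2                      # second residual connection
```

Because only `mixer` touches the time axis, the residual connections and channel MLP carry over unchanged when attention is replaced, which is what makes the parameter and FLOP savings attributable to the mixer itself.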
Hanyu Ding
School of Computer Science and Communication Engineering, Jiangsu University, Zhenjiang, 212013, China
Wenlong Dong
Southern University of Science and Technology
Robotics, Perception
Qirong Mao
Jiangsu University
AI, Multimedia