MMA: A Momentum Mamba Architecture for Human Activity Recognition with Inertial Sensors

📅 2025-11-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing deep models (CNNs, RNNs, Transformers) for inertial-sensor-based human activity recognition (HAR) suffer from gradient instability, high computational overhead, and insufficient modeling of long-range temporal dependencies. To address these issues, we propose Complex Momentum Mamba (CMM), a momentum-enhanced structured state-space model. CMM is the first to incorporate a second-order momentum mechanism into the Mamba architecture, enabling stable long-term memory retention. It further introduces a complex-domain frequency-selective memory scaling mechanism to achieve frequency-aware memory modulation. While retaining linear-time complexity, CMM significantly accelerates convergence and enhances robustness in long-sequence modeling. On multiple HAR benchmarks, CMM consistently outperforms standard Mamba and Transformer baselines in accuracy, robustness, and training efficiency, achieving a favorable performance-complexity trade-off at moderate computational cost.
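The summary does not spell out how the complex-domain frequency-selective memory scaling works. One common way to picture it is a diagonal state-space recurrence with complex eigenvalues, where the magnitude of each eigenvalue sets how long the channel's memory persists and its phase sets the frequency that channel tracks. The sketch below is an illustrative assumption in that spirit, not the paper's actual formulation; all names (`complex_diag_scan`, `decay`, `freqs`) are invented for this example.

```python
import numpy as np

def complex_diag_scan(x, decay, freqs):
    """Diagonal SSM scan with complex eigenvalues lam = decay * exp(i*freq).

    |lam| (the decay) controls how long memory persists; arg(lam) (the
    frequency) controls which oscillation each state channel resonates with.
    Illustrative sketch only, not the CMM update rule.
    """
    lam = decay * np.exp(1j * np.asarray(freqs))   # one eigenvalue per channel
    h = np.zeros(len(freqs), dtype=complex)        # complex hidden state
    out = []
    for x_t in x:                                  # scalar input per step
        h = lam * h + x_t                          # linear recurrence per channel
        out.append(h.real.copy())                  # read out the real part
    return np.stack(out)
```

With `freq = 0` a channel reduces to a plain real exponential moving average; nonzero phases make channels selectively responsive to different input frequencies, which is the intuition behind frequency-aware memory modulation.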

📝 Abstract
Human activity recognition (HAR) from inertial sensors is essential for ubiquitous computing, mobile health, and ambient intelligence. Conventional deep models such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Transformers have advanced HAR but remain limited by vanishing or exploding gradients, high computational cost, and difficulty in capturing long-range dependencies. Structured state-space models (SSMs) like Mamba address these challenges with linear complexity and effective temporal modeling, yet they are restricted to first-order dynamics without stable long-term memory mechanisms. We introduce Momentum Mamba, a momentum-augmented SSM that incorporates second-order dynamics to improve the stability of information flow across time steps, robustness, and long-sequence modeling. Extensions such as Complex Momentum Mamba further expand its capacity with frequency-selective memory scaling. Experiments on multiple HAR benchmarks demonstrate consistent gains over vanilla Mamba and Transformer baselines in accuracy, robustness, and convergence speed. With only moderate increases in training cost, momentum-augmented SSMs offer a favorable accuracy-efficiency balance, establishing them as a scalable paradigm for HAR and a promising principled framework for broader sequence modeling applications.
Problem

Research questions and friction points this paper is trying to address.

Improving human activity recognition using inertial sensors with enhanced temporal modeling
Addressing vanishing gradients and long-range dependency issues in sequence models
Developing efficient architectures with better accuracy-efficiency balance for HAR
Innovation

Methods, ideas, or system contributions that make the work stand out.

Momentum-augmented SSM with second-order dynamics
Complex Momentum Mamba for frequency-selective scaling
Linear complexity with improved long-sequence modeling
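The exact momentum-augmented update is not given in this summary. As a rough illustration of what "second-order dynamics" can mean here, a first-order SSM scan h_t = A h_{t-1} + B x_t can be augmented with a heavy-ball-style velocity term, so the state update depends on the two previous states rather than one. Everything below (function names, the particular update, the `beta` coefficient) is an assumption for illustration, not the authors' method.

```python
import numpy as np

def first_order_ssm(x, A, B):
    """Vanilla first-order SSM scan: h_t = A @ h_{t-1} + B * x_t."""
    h = np.zeros(A.shape[0])
    out = []
    for x_t in x:
        h = A @ h + B * x_t
        out.append(h.copy())
    return np.stack(out)

def momentum_ssm(x, A, B, beta=0.9):
    """Heavy-ball-style second-order scan (illustrative, not the paper's rule):
        v_t = beta * v_{t-1} + (A @ h_{t-1} + B * x_t - h_{t-1})
        h_t = h_{t-1} + v_t
    The velocity v accumulates past state changes, smoothing information
    flow across time steps; beta=0 recovers the first-order scan.
    """
    d = A.shape[0]
    h, v = np.zeros(d), np.zeros(d)
    out = []
    for x_t in x:
        v = beta * v + (A @ h + B * x_t - h)
        h = h + v
        out.append(h.copy())
    return np.stack(out)
```

Both scans remain linear in sequence length; the momentum variant only adds a second state vector per layer, which matches the summary's claim of improved long-sequence behavior at moderate extra cost.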
Thai-Khanh Nguyen
Faculty of Information Technology, Dainam University, Hanoi University of Science and Technology, Hanoi 10000, Vietnam
Uyen Vo
Faculty of Artificial Intelligence, Posts and Telecommunications Institute of Technology, Hanoi 10000, Vietnam
Tan M. Nguyen
Departments of Mathematics, National University of Singapore, Singapore 119076, Singapore
Thieu N. Vo
Ton Duc Thang University, Ho Chi Minh City, Vietnam
Computer Algebra, Symbolic-Numeric Computation
Trung-Hieu Le
Faculty of Information Technology, Dainam University, Hanoi University of Science and Technology, Hanoi 10000, Vietnam
Cuong Pham
Faculty of Artificial Intelligence, Posts and Telecommunications Institute of Technology, Hanoi 10000, Vietnam