MambaMoE: Mixture-of-Spectral-Spatial-Experts State Space Model for Hyperspectral Image Classification

📅 2025-04-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Mamba-based models for hyperspectral image (HSI) classification overlook the spectral-spatial directional heterogeneity of hyperspectral scenes, limiting discriminative capability in complex heterogeneous regions. Method: This paper proposes the first Mixture-of-Experts (MoE) framework for adaptive spectral-spatial modeling in HSI classification. It introduces (1) a sparsely activated Mixture of Mamba Expert Block (MoMEB) that specializes experts along spectral and spatial directions; (2) an uncertainty-guided corrective learning (UGCL) strategy that directs training toward heterogeneous regions prone to prediction ambiguity; and (3) a hybrid architecture that couples state-space modeling with MoE to capture long-range dependencies at linear computational cost. Results: On multiple public HSI benchmarks, the method achieves state-of-the-art classification accuracy and inference efficiency with significantly fewer parameters than mainstream Mamba- and Transformer-based baselines, demonstrating strong generalization and scalability.
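The "sparse expert activation" behind MoMEB follows the standard top-k gating pattern used in mixture-of-experts models: a gate scores all experts, only the k highest-scoring ones run, and their outputs are combined with renormalized weights. The paper does not spell out its exact routing here, so the following is a minimal sketch of generic top-k MoE gating; all names, shapes, and the gating function are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sparse_moe(x, gate_W, experts, k=2):
    """Sparsely activated mixture of experts for one token vector x.

    x:       (d,) input feature
    gate_W:  (d, n_experts) gating weights (hypothetical parameterization)
    experts: list of callables, each mapping (d,) -> (d,)
    k:       number of experts activated per token
    """
    logits = x @ gate_W                       # (n_experts,) gate scores
    top = np.argsort(logits)[-k:]             # indices of the top-k experts
    weights = softmax(logits[top])            # renormalize over the selected experts
    # only the k selected experts are evaluated -- the source of MoE's efficiency
    return sum(w * experts[i](x) for w, i in zip(weights, top))
```

Because the gate weights form a convex combination, routing the same input through identity experts returns it unchanged; in MambaMoE the experts would instead be directional spectral/spatial Mamba branches.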

📝 Abstract
The Mamba model has recently demonstrated strong potential in hyperspectral image (HSI) classification, owing to its ability to perform context modeling with linear computational complexity. However, existing Mamba-based methods usually neglect the spectral and spatial directional characteristics related to heterogeneous objects in hyperspectral scenes, leading to limited classification performance. To address these issues, we propose MambaMoE, a novel spectral-spatial mixture-of-experts framework, representing the first MoE-based approach in the HSI classification community. Specifically, we design a Mixture of Mamba Expert Block (MoMEB) that leverages sparse expert activation to enable adaptive spectral-spatial modeling. Furthermore, we introduce an uncertainty-guided corrective learning (UGCL) strategy to encourage the model's attention toward complex regions prone to prediction ambiguity. Extensive experiments on multiple public HSI benchmarks demonstrate that MambaMoE achieves state-of-the-art performance in both accuracy and efficiency compared to existing advanced approaches, especially for Mamba-based methods. Code will be released.
Problem

Research questions and friction points this paper is trying to address.

Existing Mamba-based HSI methods neglect the spectral and spatial directional characteristics of heterogeneous objects
No prior MoE-based framework exists for adaptive spectral-spatial modeling in HSI classification
Predictions in complex regions prone to ambiguity remain unreliable without targeted corrective learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral-spatial mixture-of-experts framework for HSI
Mixture of Mamba Expert Block for adaptive modeling
Uncertainty-guided corrective learning for complex regions
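The uncertainty-guided corrective learning idea, steering supervision toward regions prone to prediction ambiguity, can be illustrated with entropy-based reweighting of the per-pixel loss: pixels whose predicted class distribution is close to uniform get extra weight. This is a hedged sketch of the general principle only; the function names and the specific weighting scheme are assumptions, not the paper's actual UGCL formulation.

```python
import numpy as np

def predictive_entropy(probs, eps=1e-12):
    """Normalized entropy of class probabilities per pixel.

    0 = fully confident prediction, 1 = maximally ambiguous (uniform).
    probs: (n_pixels, n_classes) rows summing to 1.
    """
    h = -(probs * np.log(probs + eps)).sum(axis=-1)
    return h / np.log(probs.shape[-1])

def uncertainty_weighted_ce(probs, labels, eps=1e-12):
    """Cross-entropy where ambiguous (high-entropy) pixels receive extra weight."""
    weights = 1.0 + predictive_entropy(probs)               # in [1, 2]
    nll = -np.log(probs[np.arange(len(labels)), labels] + eps)
    return float((weights * nll).mean())
```

Under this scheme a confidently classified pixel contributes its plain cross-entropy, while a pixel near a class boundary contributes up to twice as much, which is one simple way to "encourage the model's attention toward complex regions" as the abstract describes.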