Graph Signal Processing Meets Mamba2: Adaptive Filter Bank via Delta Modulation

📅 2026-03-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a limitation of the existing Mamba2 architecture: its multi-head recurrence lacks structured design and frequency-domain modeling, making it difficult to balance efficiency and expressiveness. We propose the first integration of graph signal processing (GSP) into state-space models, reinterpreting Mamba2 as an adaptive filter bank operating on a line graph. Building on this insight, we introduce a hierarchical filter architecture based on modulation of the discretization parameter Δ: a shared low-pass filter captures global trends, while expert high-pass filters model local details. This approach enables parameter-efficient, interpretable sequence modeling with explicit spectral characteristics. Empirical results demonstrate that our method matches Mamba2's performance on language modeling, commonsense reasoning, and long-context retrieval tasks while using only 58.9% of its parameters.
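The Δ-modulation idea in the summary can be illustrated with a toy scalar SSM (a minimal numpy sketch under the standard zero-order-hold discretization; the specific Δ values and function names are illustrative, not taken from the paper): a small Δ yields a slowly decaying impulse response that acts as a low-pass filter, while a large Δ yields a short kernel with a much flatter frequency response.

```python
import numpy as np

def ssm_kernel(delta, a=-1.0, b=1.0, length=64):
    # Zero-order-hold discretization of the scalar SSM x'(t) = a*x(t) + b*u(t):
    # a_bar = exp(delta*a); truncated impulse response h[k] = delta*b * a_bar**k.
    a_bar = np.exp(delta * a)
    return delta * b * a_bar ** np.arange(length)

def magnitude(h, omega):
    # |H(e^{i*omega})|: DTFT magnitude of the truncated impulse response.
    k = np.arange(len(h))
    return np.abs(np.sum(h * np.exp(-1j * omega * k)))

# Small delta -> a_bar near 1 -> long memory -> strongly low-pass (global trends).
h_slow = ssm_kernel(delta=0.05)
# Large delta -> a_bar near 0 -> short memory -> much flatter response (local detail).
h_fast = ssm_kernel(delta=2.0)

dc_slow, ny_slow = magnitude(h_slow, 0.0), magnitude(h_slow, np.pi)
dc_fast, ny_fast = magnitude(h_fast, 0.0), magnitude(h_fast, np.pi)
# Nyquist-to-DC gain ratio: near 0 for the slow filter, much larger for the fast one.
print(ny_slow / dc_slow, ny_fast / dc_fast)
```

Under this reading, biasing Δ per head is what assigns each filter in the bank a distinct spectral role.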

📝 Abstract
State-space models (SSMs) offer efficient alternatives to attention with linear-time recurrence. Mamba2, a recent SSM-based language model, uses selective input gating and a multi-head structure, enabling parallel computation and strong benchmark performance. However, its heads recur independently, with no structured design or analysis of how they are used. In this work, we propose Hierarchical ADaptive filter bank for Efficient SSMs (HADES), a Graph Signal Processing (GSP)-inspired framework that reinterprets Mamba2 as an adaptive filter bank on a line graph. Our hierarchical architecture introduces two filter types: shared filters for global low-pass behavior and expert filters for local high-pass behavior, realized through a structured bias on the parameter Δ. HADES achieves performance comparable to baselines including Mamba2 across benchmarks in language modeling, commonsense reasoning, and long-context retrieval, while using only 58.9% of the original parameters. HADES thus bridges GSP and neural sequence modeling, enabling efficient, hierarchical, and interpretable filtering within state-space models.
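One hedged way to picture the shared/expert split described in the abstract (an assumed reading for illustration, not the paper's actual HADES implementation): run a single shared small-Δ recurrence as the global low-pass branch, and let expert branches with larger Δ biases capture the residual high-frequency detail. The filters below are normalized to unit DC gain (our simplification) so that subtracting the shared branch leaves a genuinely high-pass residual.

```python
import numpy as np

def recurrent_filter(u, delta, a=-1.0):
    # Discretized scalar SSM recurrence, normalized to unit DC gain:
    # y[t] = a_bar*y[t-1] + (1 - a_bar)*u[t], with a_bar = exp(delta*a).
    a_bar = np.exp(delta * a)
    y = np.zeros_like(u)
    acc = 0.0
    for t, ut in enumerate(u):
        acc = a_bar * acc + (1.0 - a_bar) * ut
        y[t] = acc
    return y

def hades_like_bank(u, shared_delta=0.05, expert_deltas=(0.5, 1.0, 2.0)):
    # Shared branch: one slow (small-delta) low-pass filter for global trends.
    shared = recurrent_filter(u, shared_delta)
    # Expert branches: faster filters minus the shared branch, leaving the
    # local, high-frequency detail each expert specializes in.
    experts = [recurrent_filter(u, d) - shared for d in expert_deltas]
    return shared, experts
```

On a constant (pure-DC) input the expert branches decay toward zero, while the shared branch tracks the signal; on a noisy input the shared branch is visibly smoother than its input. The deltas here are placeholders, not the paper's learned biases.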
Problem

Research questions and friction points this paper is trying to address.

State-space models, Graph Signal Processing, Adaptive filter bank, Mamba2, Efficient neural modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph Signal Processing, State-Space Models, Adaptive Filter Bank, Mamba2, Parameter-Efficient Modeling