🤖 AI Summary
This work addresses two challenges in channel estimation for OFDM systems with a large number of subcarriers: high computational complexity and the difficulty of modeling long-range dependencies. To this end, the authors propose a neural network architecture that integrates self-attention mechanisms with a customized Mamba framework. By incorporating a bidirectional selective scanning mechanism, the model effectively captures non-causal, long-range dependencies among subcarriers. The proposed approach achieves better channel estimation performance than existing Transformer-based and baseline neural networks, while significantly reducing both the model parameter count and the space (memory) complexity. Its efficacy and accuracy are validated on the 3GPP TS 36.101 channel model, demonstrating its potential for practical deployment in large-scale OFDM systems.
📝 Abstract
This paper proposes a Mamba-assisted neural network framework incorporating a self-attention mechanism to achieve improved channel estimation with low complexity for orthogonal frequency-division multiplexing (OFDM) waveforms, particularly for configurations with a large number of subcarriers. With the integration of a customized Mamba architecture, the proposed framework handles large-scale subcarrier channel estimation efficiently while effectively capturing long-range dependencies among subcarriers. Unlike the conventional Mamba structure, this paper implements a bidirectional selective scan to improve channel estimation performance, because the channel gains at different subcarriers are non-causal. Moreover, the proposed framework exhibits lower space complexity than Transformer-based neural networks. Simulation results on the 3GPP TS 36.101 channel model demonstrate that, compared to other baseline neural network solutions, the proposed method achieves improved channel estimation performance with a reduced number of tunable parameters.
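The bidirectional selective scan mentioned above can be illustrated with a toy sketch: run a causal state-space scan over the subcarrier axis in both directions and combine the two outputs, so each subcarrier's estimate can depend on both lower- and higher-frequency neighbors. This is not the paper's implementation; all shapes, parameter names (`A`, `B`, `C`, `W_delta`), and the softplus step-size parameterization are illustrative assumptions loosely following the Mamba-style selective-scan idea.

```python
# Illustrative sketch only (not the authors' code): a toy bidirectional
# selective scan over per-subcarrier feature vectors, using a diagonal
# state-space recurrence with an input-dependent ("selective") step size.
import numpy as np

def selective_scan(x, A, B, C, W_delta):
    """Causal scan: h_t = exp(delta_t * A) * h_{t-1} + delta_t * B * x_t."""
    seq_len, dim = x.shape
    h = np.zeros(dim)
    y = np.empty_like(x)
    for t in range(seq_len):
        # Input-dependent step size is what makes the scan "selective".
        delta = np.log1p(np.exp(x[t] @ W_delta))      # softplus, shape (dim,)
        h = np.exp(delta * A) * h + delta * B * x[t]  # elementwise recurrence
        y[t] = C * h
    return y

def bidirectional_scan(x, A, B, C, W_delta):
    """Combine a forward scan and a reversed scan, since channel gains
    across subcarriers are non-causal in the frequency dimension."""
    fwd = selective_scan(x, A, B, C, W_delta)
    bwd = selective_scan(x[::-1], A, B, C, W_delta)[::-1]
    return fwd + bwd

rng = np.random.default_rng(0)
n_sub, dim = 64, 8                        # hypothetical: 64 subcarriers, 8 features
x = rng.standard_normal((n_sub, dim))
A = -np.abs(rng.standard_normal(dim))     # negative diagonal for a stable recurrence
B = rng.standard_normal(dim)
C = rng.standard_normal(dim)
W_delta = 0.1 * rng.standard_normal((dim, dim))

y = bidirectional_scan(x, A, B, C, W_delta)
print(y.shape)  # (64, 8)
```

A quick way to see the non-causality: perturbing the last subcarrier's input changes the output at the first subcarrier, which a purely causal (unidirectional) scan could never do.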