TCLNet: A Hybrid Transformer-CNN Framework Leveraging Language Models as Lossless Compressors for CSI Feedback

📅 2026-01-10
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high channel state information (CSI) feedback overhead in FDD massive MIMO systems and the limited compression efficiency of existing methods, which struggle to jointly capture local and global CSI features. To overcome this, the authors propose TCLNet, a novel framework that, for the first time, leverages a large language model as a zero-shot lossless compressor and integrates it with a Transformer-CNN hybrid architecture for lossy compression, enabling collaborative multi-scale feature extraction from CSI. Furthermore, TCLNet fuses the language model with a factorized model, adaptively switching between context-aware coding and parallel coding to optimize the rate-distortion-complexity trade-off. Experimental results on both real-world and simulated datasets demonstrate that the proposed method significantly outperforms state-of-the-art approaches, achieving substantial gains in reconstruction accuracy and transmission efficiency, with up to a 5 dB performance improvement.
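The summary's core architectural idea, a CNN branch for local features combined with a Transformer branch for global context, can be illustrated with a toy one-dimensional sketch. Everything here (the kernel, the scalar attention weights, the additive fusion) is an illustrative assumption, not TCLNet's actual architecture, which the summary does not specify in detail:

```python
import math

def conv1d(x, kernel):
    # local-feature branch: a 1-D convolution with 'same' zero padding,
    # standing in for the CNN path (kernel values are assumptions)
    k = len(kernel)
    pad = k // 2
    padded = [0.0] * pad + list(x) + [0.0] * pad
    return [sum(kernel[j] * padded[i + j] for j in range(k))
            for i in range(len(x))]

def self_attention(x, wq, wk, wv):
    # global-context branch: single-head self-attention with scalar
    # projections, standing in for the Transformer path
    q = [wq * v for v in x]
    k = [wk * v for v in x]
    val = [wv * v for v in x]
    out = []
    for i in range(len(x)):
        scores = [q[i] * kj for kj in k]        # attend over all positions
        m = max(scores)                         # stabilized softmax
        e = [math.exp(s - m) for s in scores]
        z = sum(e)
        out.append(sum(ei / z * vj for ei, vj in zip(e, val)))
    return out

def hybrid_block(x):
    # fuse local and global features; additive fusion is an assumption
    local = conv1d(x, [0.25, 0.5, 0.25])
    glob = self_attention(x, 1.0, 1.0, 1.0)
    return [l + g for l, g in zip(local, glob)]
```

The point of the sketch is the structural split: the convolution's output at each position depends only on a small neighborhood, while the attention output at each position mixes information from the entire sequence.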

📝 Abstract
In frequency division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems, downlink channel state information (CSI) plays a crucial role in achieving high spectrum and energy efficiency. However, the CSI feedback overhead becomes a major bottleneck as the number of antennas increases. Although existing deep learning-based CSI compression methods have shown great potential, they still face limitations in capturing both local and global features of CSI, thereby limiting achievable compression efficiency. To address these issues, we propose TCLNet, a unified CSI compression framework that integrates a hybrid Transformer-CNN architecture for lossy compression with a hybrid language model (LM) and factorized model (FM) design for lossless compression. The lossy module jointly exploits local features and global context, while the lossless module adaptively switches between context-aware coding and parallel coding to optimize the rate-distortion-complexity (RDC) trade-off. Extensive experiments on both real-world and simulated datasets demonstrate that the proposed TCLNet outperforms existing approaches in terms of reconstruction accuracy and transmission efficiency, achieving up to a 5 dB performance gain across diverse scenarios. Moreover, we show that large language models (LLMs) can be leveraged as zero-shot CSI lossless compressors via carefully designed prompts.
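The abstract's closing claim, that LLMs can act as zero-shot lossless compressors, rests on a standard equivalence between autoregressive probability modeling and entropy coding: any model that assigns next-symbol probabilities can drive an arithmetic coder, and better predictions yield shorter codes. A minimal sketch of that idea follows, with a toy adaptive frequency model (`model_probs`) standing in for the LLM and its prompts, which this page does not reproduce, and exact `Fraction` arithmetic used instead of a streaming bit-level coder:

```python
from fractions import Fraction

def model_probs(context, alphabet):
    # toy stand-in for the LLM's next-token distribution: add-one
    # frequency counts over the context (an assumption for illustration)
    counts = {s: 1 for s in alphabet}
    for c in context:
        counts[c] += 1
    total = sum(counts.values())
    return {s: Fraction(counts[s], total) for s in alphabet}

def encode(seq, alphabet):
    # narrow the interval [low, low + width) by each symbol's
    # conditional probability; any number in the final interval
    # identifies the whole sequence losslessly
    low, width = Fraction(0), Fraction(1)
    for i, sym in enumerate(seq):
        probs = model_probs(seq[:i], alphabet)
        cum = Fraction(0)
        for s in alphabet:
            if s == sym:
                low += width * cum
                width *= probs[s]
                break
            cum += probs[s]
    return low, width

def decode(code, n, alphabet):
    # replay the same model and pick, at each step, the symbol whose
    # sub-interval contains the code
    out = []
    low, width = Fraction(0), Fraction(1)
    for _ in range(n):
        probs = model_probs(out, alphabet)
        cum = Fraction(0)
        for s in alphabet:
            p = probs[s]
            if low + width * (cum + p) > code:
                out.append(s)
                low += width * cum
                width *= p
                break
            cum += p
    return out
```

The final interval width equals the product of the model's probabilities for the sequence, so roughly `-log2(width)` bits suffice to pick a point inside it; a sharper predictive model (such as an LLM) therefore compresses better, with no training on the data being compressed.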
Problem

Research questions and friction points this paper is trying to address.

CSI feedback
compression efficiency
massive MIMO
FDD
channel state information
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer-CNN hybrid
language model for compression
CSI feedback
lossless compression
zero-shot LLM