🤖 AI Summary
To address the high computational complexity, energy consumption, and latency bottlenecks in channel estimation for RIS-aided XL-MIMO systems in 6G edge deployments, this paper proposes a lightweight deep learning framework for efficient cascaded channel estimation. The method introduces a spatial-correlation-aware patch-based training strategy that projects the original high-dimensional channel into compact patch-level representations, preserving essential structural information while drastically reducing model parameters and inference cost. The network employs a lightweight DNN architecture that jointly models spatial correlation and processes patch-structured inputs, ensuring compatibility with edge-device constraints on power efficiency and real-time processing. Simulation results demonstrate that the proposed approach achieves 30–50% lower normalized mean square error (NMSE) across diverse antenna/RIS configurations, reduces computational complexity by one to two orders of magnitude, and exhibits strong scalability and practical deployment potential.
📝 Abstract
Next-generation wireless technologies such as 6G aim to meet demanding requirements, including ultra-high data rates, low latency, and enhanced connectivity. Extremely Large-Scale MIMO (XL-MIMO) and Reconfigurable Intelligent Surface (RIS) are key enablers: XL-MIMO boosts spectral and energy efficiency through its large number of antennas, while RIS offers dynamic control over the wireless environment via passive reflective elements. However, realizing their full potential depends on accurate Channel State Information (CSI). Recent advances in deep learning have enabled efficient cascaded channel estimation, yet the scalability and practical deployability of existing estimation models in XL-MIMO systems remain limited. The growing number of antennas and RIS elements poses a significant barrier to real-time, efficient channel estimation: it drastically increases the data volume, escalates computational complexity, demands advanced hardware, and incurs substantial energy consumption. To address these challenges, we propose a lightweight deep learning framework for efficient cascaded channel estimation in XL-MIMO systems, designed to minimize computational complexity and thus suit deployment on resource-constrained edge devices. Exploiting spatial correlations in the channel, we introduce a patch-based training mechanism that reduces the input dimensionality to patch-level representations while preserving essential information, allowing scalable training for large-scale systems. Simulation results under diverse conditions demonstrate that our framework significantly improves estimation accuracy and reduces computational complexity, even as the number of antennas and RIS elements in XL-MIMO systems grows.
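To make the patch-based idea concrete, here is a minimal, hypothetical NumPy sketch — not the authors' implementation. The channel dimensions, patch size, low-rank correlation model, and the simple per-patch rank-truncation standing in for the lightweight DNN denoiser are all illustrative assumptions; the sketch only shows how a high-dimensional cascaded channel can be split into patch-level inputs, processed independently, reassembled, and scored with NMSE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions (illustrative only): BS antennas x RIS elements.
M, N = 64, 128   # cascaded channel H is M x N (complex)
P = 16           # patch side length; each patch is P x P

# Ground-truth cascaded channel with spatial correlation,
# modeled here as a low-rank structure (an assumption).
rank = 4
H = (rng.standard_normal((M, rank)) + 1j * rng.standard_normal((M, rank))) @ \
    (rng.standard_normal((rank, N)) + 1j * rng.standard_normal((rank, N)))

# Noisy observation (stands in for a pilot-based LS estimate).
snr_lin = 10.0
noise = rng.standard_normal(H.shape) + 1j * rng.standard_normal(H.shape)
noise *= np.sqrt(np.mean(np.abs(H) ** 2) / (2 * snr_lin))
H_ls = H + noise

def to_patches(X, p):
    """Split an (m, n) matrix into non-overlapping p x p patches."""
    m, n = X.shape
    return (X.reshape(m // p, p, n // p, p)
             .transpose(0, 2, 1, 3)
             .reshape(-1, p, p))

def from_patches(patches, m, n, p):
    """Reassemble patches produced by to_patches into an (m, n) matrix."""
    return (patches.reshape(m // p, n // p, p, p)
                   .transpose(0, 2, 1, 3)
                   .reshape(m, n))

def denoise_patch(patch, keep=4):
    """Per-patch rank truncation: a placeholder for the lightweight DNN.
    Each patch is processed independently at patch-level dimensionality."""
    U, s, Vh = np.linalg.svd(patch, full_matrices=False)
    s[keep:] = 0.0
    return (U * s) @ Vh

patches = to_patches(H_ls, P)
H_hat = from_patches(np.stack([denoise_patch(p) for p in patches]), M, N, P)

def nmse(H_true, H_est):
    """Normalized mean square error, the accuracy metric used in the paper."""
    return np.sum(np.abs(H_true - H_est) ** 2) / np.sum(np.abs(H_true) ** 2)

print(f"NMSE (raw LS estimate):   {nmse(H, H_ls):.4f}")
print(f"NMSE (patch-wise denoised): {nmse(H, H_hat):.4f}")
```

The key point the sketch illustrates is that each processing step sees only a P×P patch rather than the full M×N channel, so the per-step input dimensionality stays fixed as M and N grow — which is what makes the approach attractive for resource-constrained edge devices.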