Metasurfaces-Integrated Wireless Neural Networks for Lightweight Over-The-Air Edge Inference

📅 2026-02-22
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the stringent requirements of ultra-low latency and high energy efficiency in 6G edge intelligence, which conventional digital hardware struggles to meet due to excessive power consumption. To overcome this limitation, the authors propose MINN, a physical-layer deep learning framework that pioneers the integration of programmable multi-layer metasurfaces into neural network architectures. By leveraging MIMO wireless channels as trainable computational layers directly in the propagation domain, MINN transforms the channel itself into a lightweight, low-power in-air computing unit. Through a hybrid analog–digital design and end-to-end joint training, MINN achieves performance comparable to fully digital deep neural networks on representative tasks while substantially reducing energy consumption, thereby demonstrating its feasibility and advantages for edge inference.

๐Ÿ“ Abstract
The upcoming sixth Generation (6G) of wireless networks envisions ultra-low-latency and energy-efficient Edge Inference (EI) for diverse Internet of Things (IoT) applications. However, traditional digital hardware for machine learning is power intensive, motivating the need for alternative computation paradigms. Over-The-Air (OTA) computation is regarded as an emerging transformative approach that assigns computational tasks to the wireless channel itself. This article introduces the concept of Metasurfaces-Integrated Neural Networks (MINNs), a physical-layer-enabled deep learning framework that leverages programmable multi-layer metasurface structures and Multiple-Input Multiple-Output (MIMO) channels to realize computational layers in the wave propagation domain. The MINN system is conceptualized as three modules: Encoder, Channel (uncontrollable propagation features and metasurfaces), and Decoder. The first and last modules, realized respectively at the multi-antenna transmitter and receiver, consist of conventional digital or purposely designed analog Deep Neural Network (DNN) layers, while the metasurface responses of the Channel module are optimized alongside all other modules as trainable weights. This architecture enables computation to be offloaded into the end-to-end physical layer, distributed flexibly among its constituent modules, achieving performance comparable to fully digital DNNs while significantly reducing power consumption. The training of the MINN framework, two representative variations, and performance results for indicative applications are presented, highlighting the potential of MINNs as a lightweight and sustainable solution for future EI-enabled wireless systems. The article concludes with a list of open challenges and promising research directions.
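The Encoder–Channel–Decoder decomposition described in the abstract can be sketched as a single forward pass. The sketch below is a minimal illustration, not the paper's exact design: the dimensions, the random MIMO fading matrices `H1`/`H2`, and the modeling of the metasurface as a diagonal unit-modulus phase response `phi` are all assumptions made for clarity; in the actual framework these phases would be trained jointly with the digital Encoder/Decoder weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 8 input features, 4 TX antennas,
# 6 metasurface elements, 4 RX antennas, 3 output classes.
d_in, n_tx, n_ms, n_rx, d_out = 8, 4, 6, 4, 3

# --- Encoder: digital layer at the multi-antenna transmitter ---
W_enc = rng.normal(size=(n_tx, d_in)) / np.sqrt(d_in)

# --- Channel module: uncontrollable propagation + programmable metasurface ---
# H1 (TX -> metasurface) and H2 (metasurface -> RX) are fixed complex fading.
H1 = (rng.normal(size=(n_ms, n_tx)) + 1j * rng.normal(size=(n_ms, n_tx))) / np.sqrt(2 * n_tx)
H2 = (rng.normal(size=(n_rx, n_ms)) + 1j * rng.normal(size=(n_rx, n_ms))) / np.sqrt(2 * n_ms)
# Trainable metasurface phase responses: the "weights" living in the channel.
phi = rng.uniform(0.0, 2.0 * np.pi, size=n_ms)

# --- Decoder: digital layer at the receiver (acts on real+imag parts) ---
W_dec = rng.normal(size=(d_out, 2 * n_rx)) / np.sqrt(2 * n_rx)

def minn_forward(x):
    """End-to-end pass: Encoder -> Channel (with metasurface) -> Decoder."""
    s = W_enc @ x                             # digital encoding onto TX antennas
    Theta = np.diag(np.exp(1j * phi))         # unit-modulus metasurface response
    y = H2 @ Theta @ (H1 @ s)                 # the over-the-air "layer"
    feat = np.concatenate([y.real, y.imag])   # receiver digitizes I/Q samples
    return W_dec @ feat                       # digital decoding to class logits

x = rng.normal(size=d_in)
logits = minn_forward(x)
print(logits.shape)  # (3,)
```

End-to-end joint training would then backpropagate a task loss through this chain to update `W_enc`, `phi`, and `W_dec` together, which is what lets computation be offloaded flexibly between the digital modules and the propagation domain.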
Problem

Research questions and friction points this paper is trying to address.

Edge Inference
6G
Energy Efficiency
Over-The-Air Computation
IoT
Innovation

Methods, ideas, or system contributions that make the work stand out.

Metasurfaces
Over-The-Air Computation
Edge Inference
Physical-Layer AI
MIMO