🤖 AI Summary
AI models in wireless communications have long been constrained by task specificity, poor generalization, limited semantic understanding, and inadequate human–machine interaction. Method: This paper proposes the first physical-layer wireless foundation model (WPFM) paradigm. It introduces a temporal-aware signal embedding mechanism and a unified architecture that supports self-supervised pre-training and joint multimodal modeling of time-series signals and text, and it establishes a strategic WPFM framework that formalizes heterogeneous signal representation, pre-training task design, and integration pathways with large language models (LLMs). Contribution/Results: The WPFM breaks the task-specific AI bottleneck, enabling unified understanding and semantic description of diverse wireless signals. It is the first physical-layer model to offer a human-interpretable, natural-language interaction interface, and it targets fundamental advances in signal-understanding accuracy, cross-scenario generalization, and human–machine collaboration.
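The paper does not spell out the temporal-aware signal embedding here, but one plausible shape for it, sketched under assumptions (all names and the patch-plus-sinusoidal-encoding design are hypothetical, not the authors' specification), is to split a complex I/Q time series into patches, project each patch into model space, and add a temporal encoding so patch order survives:

```python
import numpy as np

def embed_signal(iq, patch_len=16, d_model=32, rng=None):
    """Hypothetical temporal-aware embedding: patch a complex I/Q series,
    project each patch linearly, and add a sinusoidal temporal encoding."""
    rng = rng or np.random.default_rng(0)
    n = len(iq) // patch_len * patch_len            # drop any ragged tail
    patches = iq[:n].reshape(-1, patch_len)
    # Real-valued features: concatenated in-phase and quadrature parts
    feats = np.concatenate([patches.real, patches.imag], axis=1)
    W = rng.standard_normal((2 * patch_len, d_model)) / np.sqrt(2 * patch_len)
    tokens = feats @ W                               # (num_patches, d_model)
    # Sinusoidal positional (temporal) encoding, as in transformers
    pos = np.arange(tokens.shape[0])[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / (10000 ** (2 * (i // 2) / d_model))
    pe = np.where(i % 2 == 0, np.sin(angle), np.cos(angle))
    return tokens + pe

iq = np.exp(2j * np.pi * 0.05 * np.arange(256))      # toy complex tone
emb = embed_signal(iq)
print(emb.shape)                                      # (16, 32)
```

The patching step is what lets heterogeneous signals of different lengths and sampling rates map to a common token sequence that a transformer-style backbone can consume.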
📝 Abstract
Artificial intelligence (AI) plays an important role in the dynamic landscape of wireless communications, addressing challenges that traditional approaches cannot. This paper discusses the evolution of wireless AI, emphasizing the transition from isolated, task-specific models to more generalizable and adaptable AI, inspired by recent successes in large language models (LLMs) and computer vision. To move beyond task-specific AI strategies in wireless networks, we propose a unified wireless physical-layer foundation model (WPFM). Key challenges include designing effective pre-training tasks, embedding heterogeneous time series, and supporting human-understandable interaction. The paper presents a strategic framework focusing on embedding wireless time series, self-supervised pre-training, and semantic representation learning. The proposed WPFM aims to understand and describe diverse wireless signals, enabling humans to interact with wireless networks. The paper concludes by outlining the next research steps for WPFMs, including integration with LLMs.
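The abstract names self-supervised pre-training without fixing a recipe. A common recipe in foundation models, used here purely as an illustrative assumption (masked-patch reconstruction, analogous to masked language modeling; the identity "model" is a stand-in for a real backbone), scores the model only on the patches it never saw:

```python
import numpy as np

def masked_pretrain_step(tokens, mask_ratio=0.3, rng=None):
    """Hypothetical self-supervised objective: hide a fraction of embedded
    signal patches and measure reconstruction error on the hidden ones."""
    rng = rng or np.random.default_rng(1)
    n, d = tokens.shape
    mask = rng.random(n) < mask_ratio        # True = patch hidden from model
    corrupted = tokens.copy()
    corrupted[mask] = 0.0                    # replace with a "mask token"
    # Stand-in "model": identity; a real WPFM would run a transformer here
    reconstructed = corrupted
    # Loss is computed only at the masked positions
    return float(np.mean((reconstructed[mask] - tokens[mask]) ** 2))

tokens = np.random.default_rng(0).standard_normal((16, 32))
loss = masked_pretrain_step(tokens)
print(loss)
```

Because the objective needs no labels, it can exploit large unlabeled signal corpora, which is precisely what makes pre-training task design a central challenge for a WPFM.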