Bridging Visual and Wireless Sensing: A Unified Radiation Field for 3D Radio Map Construction

📅 2026-01-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing approaches treat optical and wireless sensing in isolation, hindering the construction of high-fidelity 3D radio maps essential for ambient intelligence. This work proposes URF-GS, a novel framework that unifies the radiance field representations of light and electromagnetic waves for the first time. By fusing visual and wireless observations, URF-GS jointly reconstructs scene geometry and material properties through 3D Gaussian splatting and inverse rendering, while simultaneously predicting wireless signal propagation under arbitrary transmitter-receiver configurations. The method substantially enhances both the accuracy and generalization capability of 3D radio mapping, achieving up to a 24.7% improvement in spatial spectrum prediction accuracy over NeRF and demonstrating a tenfold increase in sample efficiency.

📝 Abstract
The emerging applications of next-generation wireless networks (e.g., immersive 3D communication, low-altitude networks, and integrated sensing and communication) necessitate high-fidelity environmental intelligence. 3D radio maps have emerged as a critical tool for this purpose, enabling spectrum-aware planning and environment-aware sensing by bridging the gap between physical environments and electromagnetic signal propagation. However, constructing accurate 3D radio maps requires fine-grained 3D geometric information and a profound understanding of electromagnetic wave propagation. Existing approaches typically treat optical and wireless knowledge as distinct modalities, failing to exploit the fundamental physical principles governing both light and electromagnetic propagation. To bridge this gap, we propose URF-GS, a unified radio-optical radiation field representation framework for accurate and generalizable 3D radio map construction based on 3D Gaussian splatting (3D-GS) and inverse rendering. By fusing visual and wireless sensing observations, URF-GS recovers scene geometry and material properties while accurately predicting radio signal behavior at arbitrary transmitter-receiver (Tx-Rx) configurations. Experimental results demonstrate that URF-GS achieves up to a 24.7% improvement in spatial spectrum prediction accuracy and a 10x increase in sample efficiency for 3D radio map construction compared with neural radiance field (NeRF)-based methods. This work establishes a foundation for next-generation wireless networks by integrating perception, interaction, and communication through holistic radiation field reconstruction.
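The abstract does not spell out the rendering formulation, but both NeRF- and 3D-GS-style radiation fields share the same front-to-back alpha-compositing core, which a unified radio-optical field can reuse for light and radio alike. The sketch below is purely illustrative (the function name, toy opacities, and emissions are assumptions, not the paper's actual algorithm): each primitive along a Tx-Rx ray contributes its emission weighted by its opacity and the transmittance accumulated so far.

```python
def composite_along_ray(alphas, emissions):
    """Front-to-back alpha compositing, the shared core of radiance-field
    rendering. `alphas` are per-primitive opacities along the ray;
    `emissions` are the quantities they emit (RGB radiance for the optical
    field, or signal power for the radio field)."""
    transmittance = 1.0  # fraction of the ray not yet absorbed
    total = 0.0
    for a, e in zip(alphas, emissions):
        total += transmittance * a * e   # this primitive's visible contribution
        transmittance *= (1.0 - a)      # attenuate what remains for those behind
    return total

# Toy example: two semi-transparent primitives between Tx and Rx.
signal = composite_along_ray([0.5, 0.8], [1.0, 0.5])  # → 0.7
```

In this reading, "unifying" the two modalities means fitting one set of scene primitives whose geometry is shared while the emitted quantity and material response differ per modality; the compositing rule itself is unchanged.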
Problem

Research questions and friction points this paper is trying to address.

3D radio map
visual and wireless sensing
radiation field
electromagnetic propagation
environmental intelligence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified Radiation Field
3D Radio Map
3D Gaussian Splatting
Inverse Rendering
Radio-Optical Fusion
Chaozheng Wen
PhD Student, iComAI Lab, Hong Kong University of Science and Technology (HKUST)
Wireless Communication · Computer Vision
Jingwen Tong
Department of Electronic and Computer Engineering (ECE), The Hong Kong University of Science and Technology (HKUST), Clear Water Bay, Kowloon, Hong Kong, China.
Zehong Lin
Research Assistant Professor, Hong Kong University of Science and Technology
Edge AI · Machine Learning
Chenghong Bian
Department of Electronic and Computer Engineering (ECE), The Hong Kong University of Science and Technology (HKUST), Clear Water Bay, Kowloon, Hong Kong, China.
Jun Zhang
Professor, Hong Kong University of Science and Technology, IEEE Fellow
Mobile Edge Computing · Edge AI · Wireless Communications · GenAI