Exploring Radar Data Representations in Autonomous Driving: A Comprehensive Review

📅 2023-12-08
🏛️ IEEE Transactions on Intelligent Transportation Systems (Print)
📈 Citations: 5
Influential: 0
🤖 AI Summary
Radar data representation in autonomous driving lacks standardized evaluation, suffering from weak cross-representation comparability and fragmented benchmarking. Method: This work systematically surveys five mainstream radar representations—ADC signals, radar tensors, point clouds, occupancy grids, and micro-Doppler features—unifying their generation principles, application domains, dataset support, modeling paradigms, and performance limits. It identifies key bottlenecks and future directions for robust all-weather perception. Contribution/Results: We introduce an open-source, interactive online platform enabling cross-modal retrieval, visualization, and performance comparison across representations—significantly enhancing reproducibility and standardization. The study establishes a full-stack radar representation framework spanning signal processing to deep learning, providing both theoretical foundations and practical tools for multimodal fusion and adverse-weather perception.
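The representations surveyed here form a processing chain that begins with raw ADC samples. As a minimal illustrative sketch (not code from the paper), a range-Doppler radar tensor for one receive antenna can be formed from an ADC cube with two windowed FFTs; the array shapes and Hanning windows are assumptions chosen for the example:

```python
import numpy as np

def adc_to_range_doppler(adc_cube):
    """Turn a raw ADC cube (chirps x samples) into a range-Doppler tensor.

    adc_cube: complex array of shape (num_chirps, num_samples) for a
    single receive antenna. Shapes and windowing here are illustrative
    assumptions, not values taken from the surveyed paper.
    """
    num_chirps, num_samples = adc_cube.shape
    # Range FFT along fast time (samples within each chirp)
    win_r = np.hanning(num_samples)
    range_fft = np.fft.fft(adc_cube * win_r, axis=1)
    # Doppler FFT along slow time (across chirps), shifted so that
    # zero radial velocity sits in the middle of the Doppler axis
    win_d = np.hanning(num_chirps)[:, None]
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft * win_d, axis=0), axes=0)
    return np.abs(doppler_fft)  # magnitude range-Doppler map

# Example: run the pipeline on simulated noise-only ADC data
rng = np.random.default_rng(0)
adc = rng.standard_normal((64, 128)) * 0.01 + 0j  # 64 chirps, 128 samples
rd = adc_to_range_doppler(adc)  # shape (64, 128), non-negative magnitudes
```

Stacking such maps over receive antennas (and adding an angle FFT) would yield the full range-azimuth-Doppler tensor discussed in the survey.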
📝 Abstract
With the rapid advancement of sensor technology and deep learning, autonomous driving systems are enabling safe and efficient operation of intelligent vehicles and intelligent transportation. Among the onboard sensors, the radar sensor plays a crucial role in providing robust perception information under diverse environmental conditions. This review focuses on the different radar data representations utilized in autonomous driving systems. Firstly, we introduce the capabilities and limitations of the radar sensor by examining the working principles of radar perception and the signal processing of radar measurements. Then, we delve into the generation process of five radar representations: the ADC signal, radar tensor, point cloud, grid map, and micro-Doppler signature. For each representation, we examine the related datasets, methods, advantages, and limitations. Furthermore, we discuss the challenges these data representations face and propose potential research directions. Overall, this comprehensive review offers in-depth insight into how these representations enhance autonomous system capabilities, providing guidance for radar perception researchers. To facilitate retrieval and comparison of the different data representations, datasets, and methods, we provide an interactive website at https://radar-camera-fusion.github.io/radar.
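The point-cloud representation mentioned in the abstract is typically extracted from a radar power profile with a constant false alarm rate (CFAR) detector. A hedged one-dimensional cell-averaging CFAR sketch follows; the window sizes and threshold scale are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def ca_cfar_1d(power, guard=2, train=8, scale=3.0):
    """Cell-averaging CFAR detector on a 1-D power profile.

    Returns indices of cells whose power exceeds `scale` times the mean
    power of the surrounding training cells. Guard/training window sizes
    and the threshold scale are illustrative assumptions.
    """
    n = len(power)
    detections = []
    half = guard + train
    for i in range(half, n - half):
        # Training cells on both sides of the cell under test,
        # excluding the guard cells immediately around it
        left = power[i - half:i - guard]
        right = power[i + guard + 1:i + half + 1]
        noise = np.mean(np.concatenate([left, right]))
        if power[i] > scale * noise:
            detections.append(i)
    return detections

# Example: a flat noise floor with one strong return
profile = np.ones(64)
profile[30] = 50.0
print(ca_cfar_1d(profile))  # → [30]
```

Applying such a detector along the range and Doppler axes of a radar tensor, then keeping only the detected cells, is one common way the dense tensor is sparsified into a radar point cloud.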
Problem

Research questions and friction points this paper is trying to address.

Reviewing radar data representations in autonomous driving systems
Analyzing capabilities and limitations of radar sensor technology
Exploring challenges and future research directions in radar perception
Innovation

Methods, ideas, or system contributions that make the work stand out.

Systematically surveys five radar data representations
Analyzes the datasets, methods, advantages, and limitations of each representation
Proposes future research directions for radar perception
Shanliang Yao
Yancheng Institute of Technology
Autonomous Driving · Intelligent Vehicles · Radar-Camera Fusion · Maritime Perception
Runwei Guan
Hong Kong University of Science and Technology (Guangzhou) / Founder of FertiTech AI
Multi-Modal Learning · Unmanned Surface Vessel · Radar Perception · AI Medicine
Zitian Peng
Faculty of Science and Engineering, University of Liverpool, Liverpool, UK
Chenhang Xu
School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou, China
Yilu Shi
School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou, China
Eng Gee Lim
School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou, China
Hyungjoon Seo
Faculty of Science and Engineering, University of Liverpool, Liverpool, UK
Ka Lok Man
Professor, Xi'an Jiaotong-Liverpool University
Xiaohui Zhu
Xi'an Jiaotong-Liverpool University
Autonomous Navigation · Robotics · AI Applications · Environment Monitoring · AIoT
Yutao Yue
Yong Yue
Xi'an Jiaotong-Liverpool University
CAD/CAM · Computer Graphics · Robotics and AI Applications