🤖 AI Summary
Radar data representation in autonomous driving lacks standardized evaluation, suffering from weak cross-representation comparability and fragmented benchmarking. Method: This work systematically surveys five mainstream radar representations (ADC signals, radar tensors, point clouds, grid maps, and micro-Doppler signatures), unifying their generation principles, application domains, dataset support, modeling paradigms, and performance limits, and identifies key bottlenecks and future directions for robust all-weather perception. Contribution/Results: The work introduces an open-source, interactive online platform enabling cross-modal retrieval, visualization, and performance comparison across representations, enhancing reproducibility and standardization. The study establishes a full-stack radar representation framework spanning signal processing to deep learning, providing both theoretical foundations and practical tools for multimodal fusion and adverse-weather perception.
📝 Abstract
With the rapid advancement of sensor technology and deep learning, autonomous driving systems are enabling safe and efficient intelligent vehicles and intelligent transportation. Among the onboard sensors, radar plays a crucial role by providing robust perception in diverse environmental conditions. This review explores the radar data representations used in autonomous driving systems. First, we introduce the capabilities and limitations of the radar sensor by examining the working principles of radar perception and the signal processing of radar measurements. Then, we delve into the generation process of five radar representations: the ADC signal, radar tensor, point cloud, grid map, and micro-Doppler signature. For each representation, we examine the related datasets, methods, advantages, and limitations. Furthermore, we discuss the challenges these representations face and propose potential research directions. Overall, this comprehensive review offers in-depth insight into how these representations enhance autonomous system capabilities, providing guidance for radar perception researchers. To facilitate retrieval and comparison of the different data representations, datasets, and methods, we provide an interactive website at https://radar-camera-fusion.github.io/radar.