CoVeRaP: Cooperative Vehicular Perception through mmWave FMCW Radars

πŸ“… 2025-08-21
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
FMCW radar remains reliable in rain and glare, but its sparse, noisy point clouds limit 3D detection accuracy. To address this, the paper introduces CoVeRaP, the first reproducible multi-vehicle cooperative-perception dataset: 21,000 frames with high-precision temporal synchronization across millimeter-wave radar, camera, and GPS streams. The authors propose a unified framework supporting both middle and late fusion, combining radar intensity encoding with a three-channel (spatial, Doppler, intensity) PointNet++ encoder enhanced with self-attention, which jointly predicts 3D bounding boxes and point-wise depth confidence. Experiments show that middle fusion improves mAP by up to 9× at IoU 0.9 and substantially outperforms single-vehicle baselines in adverse weather. The dataset and code are publicly released.

πŸ“ Abstract
Automotive FMCW radars remain reliable in rain and glare, yet their sparse, noisy point clouds constrain 3-D object detection. We therefore release CoVeRaP, a 21 k-frame cooperative dataset that time-aligns radar, camera, and GPS streams from multiple vehicles across diverse manoeuvres. Built on this data, we propose a unified cooperative-perception framework with middle- and late-fusion options. Its baseline network employs a multi-branch PointNet-style encoder enhanced with self-attention to fuse spatial, Doppler, and intensity cues into a common latent space, which a decoder converts into 3-D bounding boxes and per-point depth confidence. Experiments show that middle fusion with intensity encoding boosts mean Average Precision by up to 9x at IoU 0.9 and consistently outperforms single-vehicle baselines. CoVeRaP thus establishes the first reproducible benchmark for multi-vehicle FMCW-radar perception and demonstrates that affordable radar sharing markedly improves detection robustness. Dataset and code are publicly available to encourage further research.
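The abstract's multi-branch encoder can be pictured as three per-cue feature branches lifted into a shared latent space, with self-attention mixing information across points. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation; all layer sizes and weight initializations are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    # Two-layer perceptron applied independently to each point (ReLU hidden layer).
    return np.maximum(x @ w1, 0.0) @ w2

def self_attention(x):
    # Single-head scaled dot-product self-attention over the point dimension.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ x

# Toy radar point cloud: N points carrying xyz position, Doppler velocity, intensity.
N, d_lat = 128, 16
xyz       = rng.normal(size=(N, 3))
doppler   = rng.normal(size=(N, 1))
intensity = rng.normal(size=(N, 1))

# Per-branch encoders lift each cue into the same latent dimensionality.
w = lambda i, o: rng.normal(scale=0.1, size=(i, o))
f_spatial   = mlp(xyz,       w(3, 32), w(32, d_lat))
f_doppler   = mlp(doppler,   w(1, 32), w(32, d_lat))
f_intensity = mlp(intensity, w(1, 32), w(32, d_lat))

# Fuse the branches, then let self-attention share context across points.
fused  = f_spatial + f_doppler + f_intensity
latent = self_attention(fused)
print(latent.shape)  # one latent vector per radar point: (128, 16)
```

In the paper's framework this per-point latent representation would feed a decoder that regresses 3-D boxes and per-point depth confidence; here the branches are fused by summation, which is one of several plausible choices (concatenation is another).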
Problem

Research questions and friction points this paper is trying to address.

Improving 3D object detection with sparse noisy radar data
Enabling multi-vehicle cooperative perception through radar fusion
Addressing environmental challenges like rain and glare in automotive sensing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cooperative multi-vehicle dataset with aligned sensors
Unified fusion framework with attention-enhanced PointNet encoder
Radar sharing boosts mAP by up to 9× at strict IoU thresholds
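The mAP gains above are reported at an IoU threshold of 0.9, a strict overlap criterion. As a minimal illustration of what that threshold measures (simplified to axis-aligned boxes; detection benchmarks typically use oriented boxes):

```python
import numpy as np

def iou_3d_axis_aligned(a, b):
    """IoU of two axis-aligned 3-D boxes given as (xmin, ymin, zmin, xmax, ymax, zmax)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    lo = np.maximum(a[:3], b[:3])           # intersection lower corner
    hi = np.minimum(a[3:], b[3:])           # intersection upper corner
    inter = np.prod(np.clip(hi - lo, 0.0, None))
    vol_a = np.prod(a[3:] - a[:3])
    vol_b = np.prod(b[3:] - b[:3])
    return inter / (vol_a + vol_b - inter)

box = (0, 0, 0, 4, 2, 2)
print(iou_3d_axis_aligned(box, box))                 # 1.0 for identical boxes
print(iou_3d_axis_aligned(box, (2, 0, 0, 6, 2, 2)))  # 0.333... when shifted by half its length
```

At IoU 0.9 even the half-length shift above (IoU ≈ 0.33) counts as a miss, which is why dense, accurate point clouds from cooperating vehicles matter so much at this operating point.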
πŸ”Ž Similar Papers
No similar papers found.