🤖 AI Summary
To address the degradation of 3D object detection performance in adverse weather caused by LiDAR point cloud quality deterioration, this paper proposes L4DR, a fusion framework that tightly integrates LiDAR with the weather-robust 4D radar sensor. The authors pioneer early-stage fusion of these complementary modalities and design an Inter-Modal and Intra-Modal (IM²) parallel backbone to jointly extract cross-modal and intra-modal features. They further introduce a Multi-Scale Gated Fusion (MSGF) module and a Foreground-Aware Denoising (FAD) module to adaptively mitigate the asymmetric degradation of the heterogeneous sensors in rain and fog. The method improves 3D mean Average Precision (mAP) by up to 20.0% over a LiDAR-only baseline on the fog-simulated VoD dataset and delivers consistent gains on the real-world K-Radar adverse-weather benchmark, demonstrating robustness and generalization across diverse weather conditions.
📝 Abstract
LiDAR-based vision systems are integral to 3D object detection, which is crucial for autonomous navigation. However, they suffer performance degradation in adverse weather due to the quality deterioration of LiDAR point clouds. Fusing LiDAR with the weather-robust 4D radar sensor is expected to solve this problem, yet this fusion is challenging because the two sensors differ significantly in data quality and in their degree of degradation under adverse weather. To address these issues, we introduce L4DR, a weather-robust 3D object detection method that effectively fuses LiDAR and 4D radar. L4DR includes Multi-Modal Encoding (MME) and Foreground-Aware Denoising (FAD) techniques to reconcile sensor gaps, constituting the first exploration of the complementarity of early fusion between LiDAR and 4D radar. Additionally, we design an Inter-Modal and Intra-Modal (IM²) parallel feature extraction backbone coupled with a Multi-Scale Gated Fusion (MSGF) module to counteract the varying degrees of sensor degradation under adverse weather conditions. Experimental evaluation on the VoD dataset with simulated fog shows that L4DR adapts better to changing weather conditions, delivering significant gains across fog levels and improving 3D mAP by up to 20.0% over the traditional LiDAR-only approach. Moreover, results on the K-Radar dataset validate L4DR's consistent performance improvement in real-world adverse weather conditions.
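The gated-fusion idea behind MSGF can be sketched at a single scale: a learned gate, predicted from both modalities, weighs LiDAR against radar features per element, so the less-degraded sensor can dominate the fused output. This is an illustrative NumPy sketch under assumed shapes and hypothetical names (`gated_fusion`, `w_gate`, `b_gate`), not the paper's actual multi-scale implementation, which operates on convolutional feature maps.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(lidar_feat, radar_feat, w_gate, b_gate):
    """Single-scale gated fusion sketch (hypothetical names).

    The gate lies in (0, 1), so each fused element is a convex
    combination of the corresponding LiDAR and radar features.
    """
    # Predict the gate from the concatenated modality features.
    joint = np.concatenate([lidar_feat, radar_feat], axis=-1)
    gate = sigmoid(joint @ w_gate + b_gate)
    return gate * lidar_feat + (1.0 - gate) * radar_feat

# Toy example: 4 spatial cells, 8 feature channels per modality.
rng = np.random.default_rng(0)
lidar = rng.standard_normal((4, 8))
radar = rng.standard_normal((4, 8))
w = rng.standard_normal((16, 8)) * 0.1  # assumed gate weights
b = np.zeros(8)
fused = gated_fusion(lidar, radar, w, b)
print(fused.shape)  # (4, 8)
```

In the paper's setting the gate would be produced by small convolutions at each backbone scale rather than a single dense projection; the convex-combination property is what lets fusion degrade gracefully when one sensor's features deteriorate.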