AI Summary
To address the hardware redundancy and complex calibration of multi-camera bird's-eye-view (BEV) perception in autonomous driving, this paper proposes a BEV generation paradigm built on a single 360° spherical camera. Methodologically, we introduce (1) Dur360BEV, the first BEV mapping benchmark for autonomous driving using a single spherical camera; (2) SI2BEV, an end-to-end module enabling geometrically consistent spherical-to-BEV projection; (3) LiDAR-assisted supervision combined with RTK-GNSS/INS joint calibration to improve 3D geometric accuracy; and (4) a modified Focal Loss designed to mitigate the extreme class imbalance in BEV semantic segmentation. Experiments demonstrate that our approach matches the overall performance of multi-camera baselines on Dur360BEV, with significant IoU gains on small objects and sparse classes. This work establishes a new pathway toward lightweight, robust, and calibration-efficient BEV perception.
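The geometric idea behind a spherical-to-BEV projection such as SI2BEV can be illustrated with a minimal sketch: each BEV grid cell is mapped to azimuth/elevation angles and then to equirectangular pixel coordinates, where image features would be bilinearly sampled. The coordinate convention, image size, and function name below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def bev_to_equirect(points, img_h=640, img_w=1280):
    """Map 3D points (camera at origin; x forward, y left, z up)
    to pixel coordinates in an equirectangular spherical image.
    Convention and names are hypothetical, for illustration only."""
    x, y, z = points[..., 0], points[..., 1], points[..., 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    theta = np.arctan2(y, x)                    # azimuth in (-pi, pi]
    phi = np.arcsin(z / np.maximum(r, 1e-9))    # elevation in [-pi/2, pi/2]
    u = (0.5 - theta / (2.0 * np.pi)) * img_w   # column: forward maps to centre
    v = (0.5 - phi / np.pi) * img_h             # row: horizon maps to centre
    return np.stack([u, v], axis=-1)

# A BEV grid on the ground plane (z = 0) around the vehicle: each cell
# gets a sampling location in the spherical image.
xs, ys = np.meshgrid(np.linspace(-50, 50, 200), np.linspace(-50, 50, 200))
grid = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)
pix = bev_to_equirect(grid)   # shape (200, 200, 2)
```

In a full pipeline, these per-cell coordinates would drive bilinear feature sampling (e.g. a `grid_sample`-style operation) to lift 2D image features into the 3D BEV grid.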
Abstract
We present Dur360BEV, a novel spherical-camera autonomous driving dataset captured with a high-resolution 128-channel 3D LiDAR and an RTK-refined GNSS/INS system, along with a benchmark architecture that generates Bird's-Eye-View (BEV) maps from a single spherical camera. The dataset and benchmark address the challenges of BEV generation in autonomous driving, in particular reducing hardware complexity by replacing multiple perspective cameras with a single 360-degree camera. Within our benchmark architecture, we propose a novel spherical-image-to-BEV (SI2BEV) module that combines spherical imagery with a refined sampling strategy to project features from 2D to 3D. Our approach also includes an adaptation of Focal Loss designed for the extreme class imbalance typical of BEV segmentation. Extensive experiments demonstrate that this adaptation significantly improves segmentation performance on the Dur360BEV dataset, and that our benchmark both simplifies the sensor setup and achieves competitive performance.
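To see why a focal-loss-style objective helps with class imbalance, consider the standard binary formulation (Lin et al., 2017), which down-weights the loss on abundant, easily classified background pixels so that rare foreground classes dominate the gradient. The alpha/gamma defaults below are the common ones from that paper; the specific modification used in this work is not reproduced here, so this is an illustrative baseline only.

```python
import numpy as np

def focal_loss(probs, targets, alpha=0.25, gamma=2.0, eps=1e-7):
    """Per-pixel binary focal loss, averaged over the map.

    probs:   predicted foreground probabilities, any shape
    targets: binary ground-truth mask, same shape
    alpha/gamma are the common defaults; the paper's exact
    adaptation is not specified here.
    """
    probs = np.clip(probs, eps, 1.0 - eps)
    # p_t: probability the model assigns to the true class of each pixel
    p_t = np.where(targets == 1, probs, 1.0 - probs)
    alpha_t = np.where(targets == 1, alpha, 1.0 - alpha)
    # (1 - p_t)^gamma down-weights easy examples, focusing on hard ones
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

With gamma = 2, a confidently correct background pixel contributes orders of magnitude less loss than a misclassified foreground pixel, which is exactly the behaviour that helps sparse classes in BEV maps.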