Dur360BEV: A Real-world Single 360-degree Camera Dataset and Benchmark for Bird-Eye View Mapping in Autonomous Driving

📅 2025-03-02
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the hardware redundancy and complex calibration of multi-camera bird's-eye-view (BEV) perception in autonomous driving, this paper proposes a BEV generation paradigm built on a single 360° spherical camera. Methodologically, the paper introduces (1) Dur360BEV, the first BEV mapping benchmark for autonomous driving using a single spherical camera; (2) SI2BEV, an end-to-end module enabling geometrically consistent spherical-to-BEV projection; (3) LiDAR-assisted supervision combined with RTK-GNSS/INS joint calibration to improve 3D geometric accuracy; and (4) a modified Focal Loss designed to mitigate the extreme class imbalance in BEV semantic segmentation. Experiments show that the approach achieves overall performance comparable to multi-camera baselines on Dur360BEV, with notable IoU gains on small objects and sparse classes. This work establishes a pathway toward lightweight, robust, and calibration-efficient BEV perception.
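To make the spherical-to-BEV projection concrete, the sketch below samples an equirectangular feature map along the ray from the camera to each BEV grid cell. This is a minimal illustration, not the paper's SI2BEV implementation: the function name spherical_to_bev_sample, the flat-ground assumption, the grid extents, and the camera height are assumptions made only for the example.

```python
# Minimal sketch of spherical-image-to-BEV sampling; NOT the paper's SI2BEV code.
# Assumptions (illustrative only): equirectangular feature layout, a flat ground
# plane, a fixed camera height, and a square BEV grid centred on the vehicle.
import math

import torch
import torch.nn.functional as F


def spherical_to_bev_sample(feat, bev_size=200, bev_range=50.0, cam_height=1.8):
    """Pull equirectangular features onto a BEV grid by ray-casting each cell.

    feat: (B, C, H, W) features over (latitude, longitude) of the sphere.
    Returns a (B, C, bev_size, bev_size) BEV feature map.
    """
    B, device = feat.shape[0], feat.device

    # Metric (x, y) coordinates of every BEV cell, vehicle at the origin.
    coords = torch.linspace(-bev_range, bev_range, bev_size, device=device)
    y, x = torch.meshgrid(coords, coords, indexing="ij")

    # Direction from the camera to the ground point beneath each cell.
    azimuth = torch.atan2(y, x)                                   # in [-pi, pi]
    dist = torch.sqrt(x ** 2 + y ** 2).clamp(min=1e-3)
    elevation = torch.atan2(-torch.full_like(dist, cam_height), dist)  # below horizon

    # Equirectangular mapping: azimuth -> u, elevation -> v, normalised to the
    # [-1, 1] range expected by grid_sample.
    u = azimuth / math.pi
    v = -2.0 * elevation / math.pi

    grid = torch.stack([u, v], dim=-1).unsqueeze(0).expand(B, -1, -1, -1)
    return F.grid_sample(feat, grid, align_corners=False)
```

Under these assumptions a single grid_sample call covers the full 360° of azimuth, which is the property that lets one spherical camera stand in for a multi-camera rig.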

πŸ“ Abstract
We present Dur360BEV, a novel spherical camera autonomous driving dataset equipped with a high-resolution 128-channel 3D LiDAR and an RTK-refined GNSS/INS system, along with a benchmark architecture designed to generate Bird-Eye-View (BEV) maps using only a single spherical camera. This dataset and benchmark address the challenges of BEV generation in autonomous driving, particularly by reducing hardware complexity through the use of a single 360-degree camera instead of multiple perspective cameras. Within our benchmark architecture, we propose a novel spherical-image-to-BEV (SI2BEV) module that leverages spherical imagery and a refined sampling strategy to project features from 2D to 3D. Our approach also includes an innovative application of Focal Loss, specifically adapted to address the extreme class imbalance often encountered in BEV segmentation tasks. Through extensive experiments, we demonstrate that this application of Focal Loss significantly improves segmentation performance on the Dur360BEV dataset. The results show that our benchmark not only simplifies the sensor setup but also achieves competitive performance.
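The Focal Loss adaptation mentioned in the abstract targets the extreme imbalance of BEV grids, where the vast majority of cells belong to empty road or background. The paper's exact modification is not reproduced here; the sketch below shows the standard per-cell binary focal loss (Lin et al., 2017) that such an adaptation builds on, with illustrative alpha and gamma values.

```python
# Generic per-cell binary focal loss for BEV segmentation maps, as a sketch of
# the kind of imbalance-aware loss the paper adapts; the paper's exact variant
# and hyperparameters may differ.
import torch
import torch.nn.functional as F


def bev_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Focal loss over a BEV segmentation grid.

    logits:  (B, K, H, W) raw class scores for K BEV classes.
    targets: (B, K, H, W) float binary ground-truth occupancy per class.
    """
    prob = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")

    # p_t is the probability assigned to the true label of each cell.
    p_t = prob * targets + (1.0 - prob) * (1.0 - targets)
    # Down-weight easy cells (p_t close to 1) so rare classes dominate the loss.
    focal_weight = (1.0 - p_t) ** gamma

    # Optional class balancing between positive and negative cells.
    alpha_t = alpha * targets + (1.0 - alpha) * (1.0 - targets)

    return (alpha_t * focal_weight * ce).mean()
```

The (1 - p_t)^gamma factor suppresses the many easy background cells, so the gradient signal concentrates on the rare, hard foreground cells that otherwise get drowned out.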
Problem

Research questions and friction points this paper is trying to address.

Generating Bird-Eye-View maps from a single 360-degree camera rather than a multi-camera rig.
Reducing hardware and calibration complexity in autonomous driving perception systems.
Handling the extreme class imbalance that limits BEV segmentation performance.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Single 360-degree camera for BEV mapping
Spherical-image-to-BEV module with refined sampling
Focal Loss adaptation for BEV segmentation
Authors
Wenke E (Department of Computer Science, Durham University, UK)
Chao Yuan
Li Li (Department of Computer Science, Durham University, UK)
Yixin Sun
Yona Falinie A. Gaus (Department of Computer Science, Durham University, UK)
Amir Atapour-Abarghouei (Department of Computer Science, Durham University)
Toby P. Breckon (Department of Computer Science and Department of Engineering, Durham University, UK)