🤖 AI Summary
Bird’s-eye-view (BEV) semantic segmentation of surround-view fisheye images suffers from low accuracy due to extreme nonlinear distortion, occlusion, and depth ambiguity. Method: We propose an end-to-end geometry-aware framework. First, multi-camera joint calibration enables Gaussian-parameterized back-projection of fisheye pixels into 3D space, explicitly modeling anisotropic geometric uncertainty. Second, a differentiable splatting fusion mechanism performs continuous, uncertainty-aware semantic aggregation directly in BEV, bypassing conventional distortion correction and perspective warping. The method operates natively on raw high-resolution fisheye inputs and supports depth distribution estimation and cross-camera consistency optimization. Results: On challenging parking and urban driving scenes, our approach achieves 87.75% IoU for drivable area and 57.26% IoU for vehicle segmentation, significantly improving robustness and accuracy under severe distortion and in dynamic environments.
📝 Abstract
Accurate BEV semantic segmentation from fisheye imagery remains challenging due to the extreme nonlinear distortion, occlusion, and depth ambiguity inherent to wide-angle projections. We present a distortion-aware BEV segmentation framework that directly processes high-resolution multi-camera fisheye images, utilizing calibrated geometric unprojection and per-pixel depth distribution estimation. Each image pixel is lifted into 3D space via a Gaussian parameterization that predicts a spatial mean and an anisotropic covariance, explicitly modeling geometric uncertainty. The projected 3D Gaussians are fused into a BEV representation via differentiable splatting, producing continuous, uncertainty-aware semantic maps without requiring undistortion or perspective rectification. Extensive experiments demonstrate strong segmentation performance in complex parking and urban driving scenarios, achieving IoU scores of 87.75% for drivable regions and 57.26% for vehicles under severe fisheye distortion and diverse environmental conditions.
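To make the lifting-and-splatting idea concrete, the following is a minimal NumPy sketch, not the paper's implementation: the function name `splat_gaussians_to_bev`, the grid extent, and the per-cell weight normalization are all assumptions made for illustration. In the actual framework, the Gaussian means and covariances are predicted by a network from fisheye pixels, and the splatting runs as a differentiable operation inside the training graph; here we only show how anisotropic Gaussians can deposit uncertainty-weighted class scores onto a BEV grid.

```python
import numpy as np

def splat_gaussians_to_bev(means, covs, logits, grid_size=32, extent=8.0):
    """Splat 3D Gaussians into a BEV semantic grid (illustrative sketch).

    means:  (N, 3) Gaussian centers in metres (x, y, z)
    covs:   (N, 3, 3) anisotropic covariances modeling geometric uncertainty
    logits: (N, C) per-Gaussian class scores
    Each BEV cell accumulates class scores weighted by the ground-plane
    (x, y) Gaussian density at the cell centre, then normalizes per cell,
    yielding a continuous, uncertainty-aware semantic map.
    """
    n_cls = logits.shape[1]
    bev = np.zeros((grid_size, grid_size, n_cls))
    wsum = np.full((grid_size, grid_size), 1e-8)  # avoid divide-by-zero

    # BEV cell centres in metres, covering [-extent/2, extent/2]^2
    xs = (np.arange(grid_size) + 0.5) / grid_size * extent - extent / 2
    gx, gy = np.meshgrid(xs, xs, indexing="ij")
    cells = np.stack([gx, gy], axis=-1)  # (G, G, 2)

    for mu, cov, lg in zip(means, covs, logits):
        mu2, cov2 = mu[:2], cov[:2, :2]        # ground-plane marginal
        inv = np.linalg.inv(cov2)
        d = cells - mu2                        # offsets to every cell
        m2 = np.einsum("...i,ij,...j->...", d, inv, d)  # squared Mahalanobis
        w = np.exp(-0.5 * m2)                  # unnormalized Gaussian weight
        bev += w[..., None] * lg               # uncertainty-weighted scores
        wsum += w
    return bev / wsum[..., None]               # soft per-cell class scores
```

Note how an elongated covariance (e.g. large variance along the viewing ray, reflecting depth ambiguity) smears a pixel's semantic evidence over many BEV cells along that ray, while a confident Gaussian concentrates its vote in a few cells; this is the behavior the anisotropic parameterization is meant to capture.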