GS-LiDAR: Generating Realistic LiDAR Point Clouds with Panoramic Gaussian Splatting

πŸ“… 2025-01-22
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the heavy training and rendering costs and the poor suitability for driving scenes of NeRF-based LiDAR novel-view synthesis, this paper proposes GS-LiDAR, a framework built on panoramic Gaussian splatting. The method introduces three key innovations: (1) modeling dynamic driving scenes with 2D Gaussian primitives that carry periodic vibration properties; (2) a panoramic rendering pipeline that explicitly computes ray–splat intersections under panoramic LiDAR supervision; and (3) intensity and ray-drop spherical harmonic (SH) coefficients attached to the Gaussian primitives, which jointly enhance the geometric fidelity and radiometric realism of the synthesized point clouds. Evaluated on KITTI-360 and nuScenes, the approach achieves 3–5× faster training and rendering than NeRF-based methods while significantly outperforming them in PSNR and SSIM, enabling efficient, scalable, and physically plausible LiDAR simulation for autonomous driving.
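The "explicit ray–splat intersection" mentioned above treats each 2D Gaussian as a planar disk and intersects the LiDAR ray with that plane analytically, rather than approximating the splat in image space. A minimal sketch of that idea (illustrative names and conventions, not the paper's actual implementation):

```python
import numpy as np

def ray_splat_intersect(o, d, center, tu, tv):
    """Explicit intersection of a ray o + s*d with a planar 2D Gaussian.

    The splat lies in the plane through `center` spanned by its two
    scaled tangent axes `tu`, `tv`.  Returns (depth s, Gaussian weight
    at the hit point), or (None, 0.0) if the ray is parallel to it.
    """
    n = np.cross(tu, tv)                  # plane normal (unnormalized)
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:
        return None, 0.0                  # ray parallel to the splat
    s = np.dot(n, center - o) / denom     # ray depth of the hit point
    x = o + s * d - center                # hit point in the splat frame
    u = np.dot(x, tu) / np.dot(tu, tu)    # local coords; 1.0 = one scale
    v = np.dot(x, tv) / np.dot(tv, tv)
    w = np.exp(-0.5 * (u * u + v * v))    # 2D Gaussian falloff
    return s, w
```

A ray hitting a splat dead-center returns weight 1.0 and the exact plane depth; off-center hits decay with the Gaussian, which is what drives alpha-blended depth and intensity along each LiDAR ray.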

πŸ“ Abstract
LiDAR novel view synthesis (NVS) has emerged as a novel task within LiDAR simulation, offering valuable simulated point cloud data from novel viewpoints to aid in autonomous driving systems. However, existing LiDAR NVS methods typically rely on neural radiance fields (NeRF) as their 3D representation, which incurs significant computational costs in both training and rendering. Moreover, NeRF and its variants are designed for symmetrical scenes, making them ill-suited for driving scenarios. To address these challenges, we propose GS-LiDAR, a novel framework for generating realistic LiDAR point clouds with panoramic Gaussian splatting. Our approach employs 2D Gaussian primitives with periodic vibration properties, allowing for precise geometric reconstruction of both static and dynamic elements in driving scenarios. We further introduce a novel panoramic rendering technique with explicit ray-splat intersection, guided by panoramic LiDAR supervision. By incorporating intensity and ray-drop spherical harmonic (SH) coefficients into the Gaussian primitives, we enhance the realism of the rendered point clouds. Extensive experiments on KITTI-360 and nuScenes demonstrate the superiority of our method in terms of quantitative metrics, visual quality, as well as training and rendering efficiency.
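The "panoramic rendering technique" in the abstract rasterizes the scene onto an equirectangular range image whose columns sweep the full 360° of azimuth and whose rows cover the sensor's vertical field of view. A small sketch of that ray layout (the FOV values and axis convention below are illustrative, loosely Velodyne HDL-64E-like, not taken from the paper):

```python
import numpy as np

def panorama_rays(width, height, fov_up_deg, fov_down_deg):
    """Unit ray directions for an equirectangular LiDAR panorama.

    Columns span 360 degrees of azimuth; rows span the vertical FOV
    [fov_down_deg, fov_up_deg].  Returns an (H, W, 3) array of unit
    vectors with x forward, y left, z up.
    """
    cols = (np.arange(width) + 0.5) / width        # pixel centers, [0, 1)
    rows = (np.arange(height) + 0.5) / height
    azim = (0.5 - cols) * 2.0 * np.pi              # +pi (left) .. -pi (right)
    fov_up = np.deg2rad(fov_up_deg)
    fov_down = np.deg2rad(fov_down_deg)
    elev = fov_up - rows * (fov_up - fov_down)     # top row = fov_up
    az, el = np.meshgrid(azim, elev)
    return np.stack([np.cos(el) * np.cos(az),      # x: forward
                     np.cos(el) * np.sin(az),      # y: left
                     np.sin(el)], axis=-1)         # z: up
```

Supervising rendered range/intensity panoramas against real LiDAR sweeps projected into this layout is what lets every emitted beam, including dropped rays, receive a per-pixel loss.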
Problem

Research questions and friction points this paper is trying to address.

LiDAR
NeRF
Autonomous_Vehicles
Innovation

Methods, ideas, or system contributions that make the work stand out.

GS-LiDAR
Panoramic Gaussian Splatting
Autonomous Driving Optimization
πŸ”Ž Similar Papers
No similar papers found.
Junzhe Jiang
School of Data Science, Fudan University
Chun Gu
Fudan University
Yurui Chen
School of Data Science, Fudan University
Li Zhang
School of Data Science, Fudan University