🤖 AI Summary
This work addresses the challenge of overfitting and poor generalization in 3D Gaussian splatting for LiDAR view synthesis under extrapolated viewpoints, which limits its applicability to unseen driving trajectories in autonomous driving simulation. To overcome this, the authors propose the LiDAR-EVS framework, which leverages multi-frame LiDAR fusion to generate pseudo extrapolated-view point clouds as supervision signals and introduces a spatially constrained dropout regularization mechanism to enhance adaptability to real-world trajectory variations. The method is designed as a lightweight, plug-and-play extension that requires no modification to the backbone network. Evaluated on three benchmark datasets, LiDAR-EVS significantly improves the fidelity and robustness of LiDAR simulation under extrapolated views, achieving state-of-the-art performance and demonstrating strong potential for data-driven autonomous driving simulation and closed-loop evaluation.
📝 Abstract
3D Gaussian Splatting (3DGS) has emerged as a powerful technique for real-time LiDAR and camera synthesis in autonomous driving simulation. However, simulating LiDAR with 3DGS remains challenging for extrapolated views beyond the training trajectory: existing methods are typically trained on single-traversal sensor scans and thus suffer from severe overfitting and poor generalization to novel ego-vehicle paths. To enable reliable simulation of LiDAR along unseen driving trajectories without external multi-pass data, we present LiDAR-EVS, a lightweight framework for robust extrapolated-view LiDAR simulation in autonomous driving. Designed to be plug-and-play, LiDAR-EVS readily extends to diverse LiDAR sensors and neural rendering baselines with minimal modification. Our framework comprises two key components: (1) pseudo extrapolated-view point cloud supervision with multi-frame LiDAR fusion, view transformation, occlusion culling, and intensity adjustment; (2) spatially-constrained dropout regularization that promotes robustness to diverse trajectory variations encountered in real-world driving. Extensive experiments demonstrate that LiDAR-EVS achieves state-of-the-art performance on extrapolated-view LiDAR synthesis across three datasets, making it a promising tool for data-driven simulation, closed-loop evaluation, and synthetic data generation in autonomous driving systems.
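To make the first component more concrete, the sketch below illustrates two of the steps the abstract names: view transformation of a LiDAR point cloud to a laterally offset pseudo ego pose, and a simple range-image-style occlusion culling that keeps only the nearest return per spherical cell. This is an illustrative approximation of the general idea, not the paper's implementation; the lateral offset, bin counts, and culling scheme here are assumptions.

```python
import numpy as np

def transform_to_pseudo_view(points: np.ndarray, lateral_offset: float = 2.0) -> np.ndarray:
    """Re-express points in a pseudo ego frame shifted laterally by `lateral_offset`.

    Illustrative only: shifting the scene by -offset along y mimics moving
    the sensor by +offset (a hypothetical extrapolated viewpoint).
    """
    T = np.eye(4)
    T[1, 3] = -lateral_offset
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ T.T)[:, :3]

def occlusion_cull(points: np.ndarray, h_bins: int = 360, v_bins: int = 32) -> np.ndarray:
    """Keep only the nearest point per (azimuth, elevation) range-image cell.

    Points hidden behind a nearer surface in the pseudo view are dropped,
    approximating what a real LiDAR would observe from that pose.
    """
    r = np.linalg.norm(points, axis=1)
    az = np.arctan2(points[:, 1], points[:, 0])
    el = np.arcsin(np.clip(points[:, 2] / np.maximum(r, 1e-6), -1.0, 1.0))
    hi = ((az + np.pi) / (2 * np.pi) * h_bins).astype(int) % h_bins
    vi = np.clip(((el + np.pi / 2) / np.pi * v_bins).astype(int), 0, v_bins - 1)
    cell = hi * v_bins + vi
    keep = np.zeros(len(points), dtype=bool)
    seen = set()
    for i in np.argsort(r):          # visit points near-to-far
        if cell[i] not in seen:      # first hit in a cell wins
            seen.add(cell[i])
            keep[i] = True
    return points[keep]
```

In a full pipeline, the culled pseudo-view cloud (after intensity adjustment, not shown) would serve as the supervision signal for rendering from the extrapolated pose.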