RadarSplat: Radar Gaussian Splatting for High-Fidelity Data Synthesis and 3D Reconstruction of Autonomous Driving Scenes

📅 2025-06-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Severe noise in radar data under adverse weather, such as receiver saturation and multipath reflections, degrades perception robustness. This paper proposes RadarSplat, the first end-to-end, high-fidelity framework for raw radar data synthesis and geometrically consistent 3D scene reconstruction. The method integrates Gaussian rasterization with a physics-inspired differentiable radar imaging model to explicitly simulate receiver saturation and multipath effects, and optimizes with noise-aware volumetric rendering and multi-sensor geometric constraints. Evaluated on rain, fog, and snow scenes, it achieves a 3.4 dB PSNR gain and a 2.6× SSIM improvement in synthetic radar image quality, and reduces 3D reconstruction RMSE by 40%, a 1.5× accuracy gain. This overcomes the noise sensitivity and reliance on preprocessed imagery of prior methods, establishing the first signal-level, joint radar synthesis-and-reconstruction pipeline and improving perception reliability for autonomous driving in high-noise environments.
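As a rough illustration of the noise effects the summary describes, the sketch below renders a single azimuth ray of a radar power-range profile from 1D "Gaussians", adds an attenuated multipath ghost echo at twice each scatterer's range, and models receiver saturation as a soft clamp. This is an illustrative toy (the function name, tuple parameterization, and tanh saturation are assumptions), not the paper's actual differentiable imaging model.

```python
import math

def render_radar_ray(gaussians, num_bins, bin_size,
                     sat_level=1.0, multipath_gain=0.3):
    """Render one azimuth ray of a radar power-range profile.

    gaussians: list of (range_center, sigma, reflectance) tuples, a 1D
    stand-in for 3D Gaussians projected onto this ray (illustrative only).
    """
    profile = [0.0] * num_bins
    for center, sigma, refl in gaussians:
        for i in range(num_bins):
            r = (i + 0.5) * bin_size
            # Gaussian falloff of reflected power around the scatterer
            profile[i] += refl * math.exp(-0.5 * ((r - center) / sigma) ** 2)
    ghost = [0.0] * num_bins
    for center, sigma, refl in gaussians:
        for i in range(num_bins):
            r = (i + 0.5) * bin_size
            # Multipath: attenuated ghost echo at twice the true range
            ghost[i] += multipath_gain * refl * math.exp(
                -0.5 * ((r - 2.0 * center) / sigma) ** 2)
    # Receiver saturation: soft clamp so strong returns plateau at sat_level
    return [sat_level * math.tanh((p + g) / sat_level)
            for p, g in zip(profile, ghost)]
```

For example, a single strong scatterer at 10 m (`render_radar_ray([(10.0, 1.0, 2.0)], num_bins=64, bin_size=0.5)`) produces a saturated primary return near bin 20 and a weaker ghost near bin 40, the two artifacts RadarSplat explicitly models.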

📝 Abstract
High-fidelity 3D scene reconstruction plays a crucial role in autonomous driving by enabling novel data generation from existing datasets. This allows simulating safety-critical scenarios and augmenting training datasets without incurring further data collection costs. While recent advances in radiance fields have demonstrated promising results in 3D reconstruction and sensor data synthesis using cameras and LiDAR, their potential for radar remains largely unexplored. Radar is crucial for autonomous driving due to its robustness in adverse weather conditions like rain, fog, and snow, where optical sensors often struggle. Although the state-of-the-art radar-based neural representation shows promise for 3D driving scene reconstruction, it performs poorly in scenarios with significant radar noise, including receiver saturation and multipath reflection. Moreover, it is limited to synthesizing preprocessed, noise-excluded radar images, failing to address realistic radar data synthesis. To address these limitations, this paper proposes RadarSplat, which integrates Gaussian Splatting with novel radar noise modeling to enable realistic radar data synthesis and enhanced 3D reconstruction. Compared to the state-of-the-art, RadarSplat achieves superior radar image synthesis (+3.4 PSNR / 2.6x SSIM) and improved geometric reconstruction (-40% RMSE / 1.5x Accuracy), demonstrating its effectiveness in generating high-fidelity radar data and scene reconstruction. A project page is available at https://umautobots.github.io/radarsplat.
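For readers interpreting the "+3.4 PSNR" figure above: PSNR is a logarithmic fidelity measure in decibels, so a 3.4 dB gain is a substantial reduction in mean squared error. A minimal sketch of the standard definition (not code from the paper):

```python
import math

def psnr(reference, rendered, max_val=1.0):
    """Peak signal-to-noise ratio in dB between two equal-length images,
    given as flat lists of intensities in [0, max_val]."""
    mse = sum((a - b) ** 2 for a, b in zip(reference, rendered)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)
```

For instance, a uniform error of 0.1 on a [0, 1] image gives MSE = 0.01 and thus PSNR = 20 dB; every +10 dB corresponds to a 10x drop in MSE.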
Problem

Research questions and friction points this paper is trying to address.

Enables high-fidelity 3D reconstruction for autonomous driving scenes
Addresses radar noise issues in adverse weather conditions
Improves realistic radar data synthesis and geometric accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian Splatting for radar data
Novel radar noise modeling
Enhanced 3D scene reconstruction