🤖 AI Summary
To address the map-update latency in online 3D reconstruction for semi-static scenes, this paper proposes an incremental reconstruction framework integrating 3D Gaussian Splatting (3DGS) with Truncated Signed Distance Function (TSDF) voxel maps. Methodologically, we design a variability-aware density control strategy to intelligently insert or prune Gaussian primitives; incorporate a change-detection mechanism to trigger localized map updates; and introduce the first publicly available RGB-D dataset tailored for semi-static scene evaluation. Our key contributions are: (i) the first hybrid representation combining differentiable 3DGS rendering with geometrically consistent TSDF maps; and (ii) substantial improvements in synthetic and real-world experiments—achieving +2.1 dB PSNR gain in rendering quality, 2.3× faster map update efficiency, and enhanced localization robustness—thereby ensuring both mapping accuracy and real-time performance under dynamic conditions.
📝 Abstract
Maintaining an up-to-date map that accurately reflects recent changes in the environment is crucial, especially for robots that repeatedly traverse the same space. Failing to promptly update the changed regions can degrade map quality, resulting in poor localization, inefficient operations, and even lost robots. 3D Gaussian Splatting (3DGS) has recently seen widespread adoption in online map reconstruction due to its dense, differentiable, and photorealistic properties, yet accurately and efficiently updating the regions of change remains a challenge. In this paper, we propose VG-Mapping, a novel online 3DGS-based mapping system tailored for such semi-static scenes. Our approach introduces a hybrid representation that augments 3DGS with a TSDF-based voxel map to efficiently identify changed regions in a scene, along with a variation-aware density control strategy that inserts or deletes Gaussian primitives in regions undergoing change. Furthermore, to address the absence of public benchmarks for this task, we construct a RGB-D dataset comprising both synthetic and real-world semi-static environments. Experimental results demonstrate that our method substantially improves the rendering quality and map update efficiency in semi-static scenes. The code and dataset are available at https://github.com/heyicheng-never/VG-Mapping.