VG-Mapping: Variation-Aware 3D Gaussians for Online Semi-static Scene Mapping

📅 2025-10-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the map-update latency in online 3D reconstruction for semi-static scenes, this paper proposes an incremental reconstruction framework integrating 3D Gaussian Splatting (3DGS) with Truncated Signed Distance Function (TSDF) voxel maps. Methodologically, we design a variability-aware density control strategy to intelligently insert or prune Gaussian primitives; incorporate a change-detection mechanism to trigger localized map updates; and introduce the first publicly available RGB-D dataset tailored for semi-static scene evaluation. Our key contributions are: (i) the first hybrid representation combining differentiable 3DGS rendering with geometrically consistent TSDF maps; and (ii) substantial improvements in synthetic and real-world experiments—achieving +2.1 dB PSNR gain in rendering quality, 2.3× faster map update efficiency, and enhanced localization robustness—thereby ensuring both mapping accuracy and real-time performance under dynamic conditions.

Technology Category

Application Category

📝 Abstract
Maintaining an up-to-date map that accurately reflects recent changes in the environment is crucial, especially for robots that repeatedly traverse the same space. Failing to promptly update the changed regions can degrade map quality, resulting in poor localization, inefficient operations, and even lost robots. 3D Gaussian Splatting (3DGS) has recently seen widespread adoption in online map reconstruction due to its dense, differentiable, and photorealistic properties, yet accurately and efficiently updating the regions of change remains a challenge. In this paper, we propose VG-Mapping, a novel online 3DGS-based mapping system tailored for such semi-static scenes. Our approach introduces a hybrid representation that augments 3DGS with a TSDF-based voxel map to efficiently identify changed regions in a scene, along with a variation-aware density control strategy that inserts or deletes Gaussian primitives in regions undergoing change. Furthermore, to address the absence of public benchmarks for this task, we construct a RGB-D dataset comprising both synthetic and real-world semi-static environments. Experimental results demonstrate that our method substantially improves the rendering quality and map update efficiency in semi-static scenes. The code and dataset are available at https://github.com/heyicheng-never/VG-Mapping.
Problem

Research questions and friction points this paper is trying to address.

Online mapping for semi-static scenes with environmental changes
Efficiently updating changed regions in 3D Gaussian Splatting maps
Improving rendering quality and update efficiency in dynamic environments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid 3DGS-TSDF representation for change detection
Variation-aware density control for Gaussian updates
Novel RGB-D dataset for semi-static benchmarking
🔎 Similar Papers
No similar papers found.
Yicheng He
Yicheng He
Columbia University
video understandingcomputer vision
J
Jingwen Yu
Shenzhen Key Laboratory of Robotics and Computer Vision, Southern University of Science and Technology, Shenzhen, China; CKS Robotics Institute, Hong Kong University of Science and Technology, Hong Kong SAR, China
Guangcheng Chen
Guangcheng Chen
Southern University of Science and Technology
Polarimetric imaging3D reconstruction
H
Hong Zhang
Shenzhen Key Laboratory of Robotics and Computer Vision, Southern University of Science and Technology, Shenzhen, China