AI Summary
To address the tracking drift, map distortion, and rendering artifacts that dynamic objects cause in monocular RGB SLAM, this paper proposes the first monocular Gaussian splatting SLAM system tailored to dynamic scenes. Our method introduces three key components: (1) an uncertainty-aware geometric map that enables robust dynamic-object removal guided by per-vertex uncertainty; (2) integration of uncertainty prediction into dense bundle adjustment (BA) and Gaussian map optimization to improve reconstruction accuracy in dynamic regions; and (3) a pixel-wise uncertainty estimation module that combines a shallow MLP with DINOv2 features to drive uncertainty-weighted Gaussian splatting from monocular RGB input. Evaluated on multiple dynamic-scene datasets, our approach achieves artifact-free novel-view synthesis and outperforms state-of-the-art monocular SLAM and neural rendering methods in both tracking and reconstruction accuracy.
Abstract
We present WildGS-SLAM, a robust and efficient monocular RGB SLAM system designed to handle dynamic environments by leveraging uncertainty-aware geometric mapping. Unlike traditional SLAM systems, which assume static scenes, our approach integrates depth and uncertainty information to enhance tracking, mapping, and rendering in the presence of moving objects. We introduce an uncertainty map, predicted by a shallow multi-layer perceptron operating on DINOv2 features, to guide dynamic-object removal during both tracking and mapping. This uncertainty map weights dense bundle adjustment and Gaussian map optimization, improving reconstruction accuracy. Evaluated on multiple datasets, our system demonstrates artifact-free view synthesis and superior performance in dynamic environments compared to state-of-the-art methods.
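To make the uncertainty-weighted optimization above concrete, the sketch below shows one plausible form: a shallow two-layer MLP maps per-pixel features (random arrays stand in for DINOv2 features here) to a positive per-pixel uncertainty, which then down-weights photometric residuals. The layer sizes, the softplus activation, and the log-regularized loss are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus, keeps the uncertainty strictly positive.
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def uncertainty_head(feats, W1, b1, W2, b2):
    """Shallow 2-layer MLP mapping per-pixel features to uncertainty beta > 0."""
    h = np.maximum(feats @ W1 + b1, 0.0)              # ReLU hidden layer
    return softplus(h @ W2 + b2).squeeze(-1) + 1e-3   # (H, W) positive map

def weighted_photometric_loss(rendered, target, beta, lam=0.5):
    """Residuals down-weighted by beta^2, with a log(beta) term so the
    network cannot trivially inflate uncertainty everywhere."""
    res2 = ((rendered - target) ** 2).sum(-1)
    return float(np.mean(res2 / beta**2 + lam * np.log(beta)))

rng = np.random.default_rng(0)
H, W, C, Hid = 8, 8, 16, 32
feats = rng.standard_normal((H, W, C))      # stand-in for DINOv2 features
W1, b1 = rng.standard_normal((C, Hid)) * 0.1, np.zeros(Hid)
W2, b2 = rng.standard_normal((Hid, 1)) * 0.1, np.zeros(1)
beta = uncertainty_head(feats, W1, b1, W2, b2)

rendered = rng.standard_normal((H, W, 3))   # stand-in for a splatted render
target = rendered + 0.1 * rng.standard_normal((H, W, 3))
loss = weighted_photometric_loss(rendered, target, beta)
```

Pixels the MLP marks as uncertain (e.g. on moving objects) contribute little to the loss, so both the dense BA residuals and the Gaussian map optimization are dominated by the static parts of the scene.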