🤖 AI Summary
Existing GS-SLAM methods suffer from tracking drift and geometric degradation under sustained, severe motion blur. To address this, we propose the first robust GS-SLAM framework integrating event cameras with RGB-D data. Our method (1) explicitly models the continuous camera trajectory during exposure to decouple motion blur; (2) introduces a learnable camera response function that aligns the dynamic ranges of events and images to improve cross-modal consistency; and (3) designs a no-event loss that suppresses ringing artifacts in the Gaussian Splatting reconstruction. Evaluated on both synthetic and real-world blurred scenes, our approach significantly outperforms state-of-the-art GS-SLAM methods: pose estimation error is reduced by up to 42%, while geometric accuracy and texture fidelity of the reconstructions are substantially improved. Crucially, it maintains stable tracking and high-fidelity 3D reconstruction even under extreme motion blur.
📝 Abstract
Gaussian Splatting SLAM (GS-SLAM) offers a notable improvement over traditional SLAM methods, enabling photorealistic 3D reconstruction that conventional approaches often struggle to achieve. However, existing GS-SLAM systems perform poorly under persistent and severe motion blur commonly encountered in real-world scenarios, leading to significantly degraded tracking accuracy and compromised 3D reconstruction quality. To address this limitation, we propose EGS-SLAM, a novel GS-SLAM framework that fuses event data with RGB-D inputs to simultaneously reduce motion blur in images and compensate for the sparse and discrete nature of event streams, enabling robust tracking and high-fidelity 3D Gaussian Splatting reconstruction. Specifically, our system explicitly models the camera's continuous trajectory during exposure, supporting event- and blur-aware tracking and mapping on a unified 3D Gaussian Splatting scene. Furthermore, we introduce a learnable camera response function to align the dynamic ranges of events and images, along with a no-event loss to suppress ringing artifacts during reconstruction. We validate our approach on a new dataset comprising synthetic and real-world sequences with significant motion blur. Extensive experimental results demonstrate that EGS-SLAM consistently outperforms existing GS-SLAM systems in both trajectory accuracy and photorealistic 3D Gaussian Splatting reconstruction. The source code will be available at https://github.com/Chensiyu00/EGS-SLAM.
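To make the cross-modal alignment concrete: event cameras report log-brightness changes, while rendered RGB frames live in a different intensity domain, so a learnable camera response function can map rendered intensities into the event domain before comparing them against accumulated events. The sketch below is a minimal, hypothetical PyTorch illustration of that idea; the paper's actual parameterization, loss weights, and contrast threshold are not specified here, and all names (`LearnableCRF`, `event_consistency_loss`, `no_event_loss`, `threshold=0.2`) are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableCRF(nn.Module):
    """Hypothetical sketch of a learnable camera response function:
    a small MLP mapping rendered intensity in [0, 1] to the
    log-brightness domain observed by the event camera."""
    def __init__(self, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Softplus(),
            nn.Linear(hidden, 1),
        )

    def forward(self, intensity: torch.Tensor) -> torch.Tensor:
        # intensity: (..., 1); returns aligned log-brightness, same shape
        return self.net(intensity)

def event_consistency_loss(crf, I_t0, I_t1, event_map, threshold=0.2):
    """Compare the predicted log-brightness change between two rendered
    views against the polarity-weighted event accumulation (assumed to
    approximate threshold * sum of polarities per pixel)."""
    dL = crf(I_t1.unsqueeze(-1)) - crf(I_t0.unsqueeze(-1))
    target = threshold * event_map.unsqueeze(-1)
    return F.l1_loss(dL, target)

def no_event_loss(crf, I_t0, I_t1, event_map):
    """Sketch of a no-event term: where no events fired, the predicted
    log-brightness change should be (near) zero, discouraging ringing."""
    dL = crf(I_t1.unsqueeze(-1)) - crf(I_t0.unsqueeze(-1))
    mask = (event_map == 0).unsqueeze(-1).float()
    return (mask * dL.abs()).sum() / mask.sum().clamp(min=1.0)
```

Both terms are differentiable with respect to the rendered images and the CRF parameters, so they could be optimized jointly with the usual photometric and depth losses; this is a sketch under the stated assumptions, not the authors' implementation.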