🤖 AI Summary
To address the challenge of real-time, robust simultaneous localization and mapping (SLAM) in extreme underground environments, which are characterized by GPS denial, low illumination, high dust concentrations, and geometrically self-similar structures, this paper proposes CompSLAM, a complementary, hierarchical multimodal SLAM framework. It fuses visual, LiDAR, inertial measurement unit (IMU), and kinematic motion-model data for unified state estimation. A cross-modal pose estimation pipeline integrates hierarchical filtering and optimization, enabling online degeneracy detection and adaptive fallback between modalities. The framework has further been extended with multi-robot map sharing for marsupial robotic deployments and collaborative mapping. Deployed on all aerial, legged, and wheeled robots of Team CERBERUS during their competition-winning final run of the DARPA Subterranean (SubT) Challenge, and evaluated on a 740-meter portion of the finals course, the system demonstrated strong robustness and real-time performance. The complete source code and the SubT Finals dataset are publicly released to advance the underground robotics SLAM community.
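The core idea of resilience through redundancy, falling back to a complementary modality when the current one degenerates, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual implementation: the `PoseEstimate` type, the per-modality health flag, and the priority-ordered cascade are all illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class PoseEstimate:
    """Hypothetical per-modality output: a simplified (x, y, z) pose plus
    the result of that modality's online degeneracy check."""
    pose: Tuple[float, float, float]
    healthy: bool

def hierarchical_fallback(
    estimators: List[Callable[[], PoseEstimate]]
) -> Optional[Tuple[float, float, float]]:
    """Query modalities in priority order (e.g. visual, LiDAR, kinematic/IMU)
    and return the pose of the first one whose degeneracy check passes."""
    for estimate_fn in estimators:
        est = estimate_fn()
        if est.healthy:
            return est.pose
    # All modalities degenerate: signal the caller (e.g. hold the last pose).
    return None
```

In this toy form, a visual estimator failing its check (say, due to darkness or dust) simply yields to the next modality in the cascade, which is the complementary-redundancy behavior the summary describes.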
📝 Abstract
Robot autonomy in unknown, GPS-denied, and complex underground environments requires real-time, robust, and accurate onboard pose estimation and mapping for reliable operations. This becomes particularly challenging in perception-degraded subterranean conditions under harsh environmental factors, including darkness, dust, and geometrically self-similar structures. This paper details CompSLAM, a highly resilient and hierarchical multi-modal localization and mapping framework designed to address these challenges. Its flexible architecture achieves resilience through redundancy by leveraging the complementary nature of pose estimates derived from diverse sensor modalities. Developed during the DARPA Subterranean Challenge, CompSLAM was successfully deployed on all aerial, legged, and wheeled robots of Team CERBERUS during their competition-winning final run. Furthermore, it has proven to be a reliable odometry and mapping solution in various subsequent projects, with extensions enabling multi-robot map sharing for marsupial robotic deployments and collaborative mapping. This paper also introduces a comprehensive dataset acquired by a manually teleoperated quadrupedal robot, covering a significant portion of the DARPA Subterranean Challenge finals course. This dataset enables evaluation of CompSLAM's robustness to sensor degradation as the robot traverses 740 meters through an environment characterized by highly variable geometry and demanding lighting conditions. The CompSLAM code and the DARPA SubT Finals dataset are made publicly available for the benefit of the robotics community.