🤖 AI Summary
Existing SLAM and autonomous navigation research is hindered by the limitations of available multimodal datasets, particularly in sensor modality coverage, environmental diversity, and hardware reproducibility, which undermine the reliability and comparability of algorithm evaluation. To address this, we propose SMapper, an open-hardware, reproducible multi-sensor platform that integrates synchronized LiDAR, multi-camera, and inertial sensing for high-precision data acquisition across diverse indoor and outdoor scenarios. A dedicated spatio-temporal calibration and synchronization pipeline, together with the open-hardware design, ensures precise cross-modal alignment and cross-platform consistency. Furthermore, we release SMapper-light, a public benchmark dataset featuring sub-centimeter ground-truth trajectories and dense 3D reconstructions, and use it to benchmark state-of-the-art visual and LiDAR-based SLAM systems under standardized conditions. This work strengthens reproducibility, comparability, and methodological rigor in SLAM evaluation.
📝 Abstract
Advancing research in fields such as Simultaneous Localization and Mapping (SLAM) and autonomous navigation critically depends on reliable and reproducible multimodal datasets. While several influential datasets have driven progress in these domains, they often suffer from limitations in sensing modalities, environmental diversity, and the reproducibility of the underlying hardware setups. To address these challenges, this paper introduces SMapper, a novel open-hardware, multi-sensor platform designed explicitly for, though not limited to, SLAM research. The device integrates synchronized LiDAR, multi-camera, and inertial sensing, supported by a robust calibration and synchronization pipeline that ensures precise spatio-temporal alignment across modalities. Its open and replicable design allows researchers to extend its capabilities and reproduce experiments in both handheld and robot-mounted scenarios. To demonstrate its practicality, we additionally release SMapper-light, a publicly available SLAM dataset containing representative indoor and outdoor sequences. The dataset includes tightly synchronized multimodal data and ground-truth trajectories derived from offline LiDAR-based SLAM with sub-centimeter accuracy, alongside dense 3D reconstructions. Furthermore, the paper presents benchmarking results for state-of-the-art LiDAR and visual SLAM frameworks on the SMapper-light dataset. By combining open-hardware design, reproducible data collection, and comprehensive benchmarking, SMapper establishes a robust foundation for advancing SLAM algorithm development, evaluation, and reproducibility.
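The abstract does not spell out the evaluation protocol, but SLAM benchmarking against ground-truth trajectories conventionally reports absolute trajectory error (ATE) after rigid alignment. As a hedged illustration only (not the paper's actual pipeline), a minimal ATE-RMSE computation using closed-form Umeyama alignment might look like:

```python
import numpy as np

def align_umeyama(est, gt):
    """Closed-form rigid (rotation + translation) alignment of an
    estimated trajectory to ground truth; both are (N, 3) position arrays.
    Returns R, t such that gt ≈ est @ R.T + t."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # Cross-covariance between centered trajectories.
    U, _, Vt = np.linalg.svd(G.T @ E / len(est))
    # Guard against a reflection in the SVD solution.
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        D[2, 2] = -1.0
    R = U @ D @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error after alignment."""
    R, t = align_umeyama(est, gt)
    residuals = gt - (est @ R.T + t)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())

# Toy check: a rotated and translated copy of the ground truth
# should yield (numerically) zero ATE after alignment.
gt = np.cumsum(np.random.default_rng(0).normal(size=(100, 3)), axis=0)
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
est = gt @ Rz.T + np.array([1.0, -2.0, 0.5])
print(round(ate_rmse(est, gt), 6))  # → 0.0
```

Tools such as the widely used `evo` package implement this metric (plus relative pose error) with more options; the sketch above only shows the core computation.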