SMapper: A Multi-Modal Data Acquisition Platform for SLAM Benchmarking

📅 2025-09-11
🤖 AI Summary
Existing SLAM and autonomous navigation research is hindered by insufficient multimodal datasets—particularly in sensor modality coverage, environmental diversity, and hardware reproducibility—undermining the reliability and comparability of algorithm evaluation. To address this, we propose SMapper, an open-source, reproducible multimodal SLAM benchmark platform integrating synchronized LiDAR, multi-camera arrays, and IMU for high-precision data acquisition across diverse indoor and outdoor scenarios. We design a tightly coupled spatiotemporal calibration pipeline and an open-hardware implementation to ensure cross-platform consistency. Furthermore, we release SMapper-light, a benchmark dataset featuring sub-centimeter ground-truth trajectories and dense 3D reconstructions. Leveraging this dataset, we conduct standardized benchmarking of state-of-the-art visual and LiDAR-based SLAM systems. Our work significantly enhances reproducibility, comparability, and methodological rigor in SLAM evaluation.

📝 Abstract
Advancing research in fields like Simultaneous Localization and Mapping (SLAM) and autonomous navigation critically depends on reliable and reproducible multimodal datasets. While several influential datasets have driven progress in these domains, they often suffer from limitations in sensing modalities, environmental diversity, and the reproducibility of the underlying hardware setups. To address these challenges, this paper introduces SMapper, a novel open-hardware, multi-sensor platform designed explicitly for, though not limited to, SLAM research. The device integrates synchronized LiDAR, multi-camera, and inertial sensing, supported by a robust calibration and synchronization pipeline that ensures precise spatio-temporal alignment across modalities. Its open and replicable design allows researchers to extend its capabilities and reproduce experiments across both handheld and robot-mounted scenarios. To demonstrate its practicality, we additionally release SMapper-light, a publicly available SLAM dataset containing representative indoor and outdoor sequences. The dataset includes tightly synchronized multimodal data and ground-truth trajectories derived from offline LiDAR-based SLAM with sub-centimeter accuracy, alongside dense 3D reconstructions. Furthermore, the paper contains benchmarking results on state-of-the-art LiDAR and visual SLAM frameworks using the SMapper-light dataset. By combining open-hardware design, reproducible data collection, and comprehensive benchmarking, SMapper establishes a robust foundation for advancing SLAM algorithm development, evaluation, and reproducibility.
Problem

Research questions and friction points this paper is trying to address.

Addressing limitations in multimodal SLAM datasets' sensing and diversity
Providing synchronized LiDAR, camera, and inertial sensing platform
Enabling reproducible SLAM research through open-hardware design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Open-hardware multi-sensor platform design
Synchronized LiDAR, camera, and inertial sensing
Reproducible calibration and synchronization pipeline
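To make the synchronization idea above concrete, here is a minimal, generic sketch of pairing sensor streams by nearest timestamp within a tolerance. This is an illustration only, not the paper's actual calibration or synchronization pipeline; the function name, tolerance value, and sample timestamps are all hypothetical.

```python
import bisect

def nearest_sync(ref_stamps, query_stamps, tol=0.005):
    """Pair each query timestamp with its nearest reference timestamp
    within `tol` seconds; unmatched queries are dropped.
    Both input lists must be sorted in ascending order."""
    pairs = []
    for q in query_stamps:
        i = bisect.bisect_left(ref_stamps, q)
        candidates = []
        if i < len(ref_stamps):
            candidates.append(ref_stamps[i])      # first stamp >= q
        if i > 0:
            candidates.append(ref_stamps[i - 1])  # last stamp < q
        best = min(candidates, key=lambda r: abs(r - q))
        if abs(best - q) <= tol:
            pairs.append((best, q))
    return pairs

# Example: 10 Hz reference (e.g., LiDAR) vs. ~30 Hz query (e.g., camera), in seconds
lidar = [0.00, 0.10, 0.20, 0.30]
camera = [0.001, 0.034, 0.067, 0.099, 0.133]
print(nearest_sync(lidar, camera, tol=0.01))  # → [(0.0, 0.001), (0.1, 0.099)]
```

Only camera frames within 10 ms of a LiDAR sweep are kept; in practice, hardware-triggered platforms like the one described here aim to make such tolerances far tighter than software matching alone can.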
Pedro Miguel Bastos Soares
Automation and Robotics Research Group (ARG), Interdisciplinary Centre for Security, Reliability, and Trust (SnT), University of Luxembourg, L-1359 Luxembourg, Luxembourg
Ali Tourani
Interdisciplinary Centre for Security, Reliability, and Trust (SnT), University of Luxembourg
Computer Vision, Visual SLAM, Situational Awareness, Interdisciplinarity
Miguel Fernandez-Cortizas
Postdoctoral researcher, ARG-SnT, University of Luxembourg
Robotics, Machine Learning, Reinforcement Learning, UAV, MAV
Asier Bikandi Noya
Automation and Robotics Research Group (ARG), Interdisciplinary Centre for Security, Reliability, and Trust (SnT), University of Luxembourg, L-1359 Luxembourg, Luxembourg
Jose Luis Sanchez-Lopez
Research Scientist at SnT - University of Luxembourg
Aerial Robotics, Situational Awareness, Trajectory Planning, Perception, SLAM
Holger Voos
University of Luxembourg, SnT Automation & Robotics Research Group
Control Engineering, Automation, Mobile Robotics