CU-Multi: A Dataset for Multi-Robot Data Association

πŸ“… 2025-05-23
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
Multi-robot data association faces significant challenges in real-world scenarios, including difficulty in spatiotemporal alignment and strong dependence of observations on robot pose; existing benchmarks predominantly rely on segmented single-trajectory simulations, lacking realism. This paper introduces CU-Multiβ€”the first ground-truth-controllable, multimodal (RGB-D/LiDAR/GPS) benchmark for multi-robot data association. It achieves controllable trajectory overlap and observation diversity via four synchronized runs on a single mobile platform. We propose a novel spatiotemporal-alignment-driven data acquisition paradigm, complemented by dense semantic LiDAR annotations and centimeter-level RTK-GPS with precise geographic heading. CU-Multi overcomes the limitations of single-trajectory segmentation, enabling fair, standardized evaluation of multi-robot data association, collaborative SLAM, and cross-robot loop closure detection. The dataset is publicly released.

πŸ“ Abstract
Multi-robot systems (MRSs) are valuable for tasks such as search and rescue due to their ability to coordinate over shared observations. A central challenge in these systems is aligning independently collected perception data across space and time, i.e., multi-robot data association. While recent advances in collaborative SLAM (C-SLAM), map merging, and inter-robot loop closure detection have significantly progressed the field, evaluation strategies still predominantly rely on splitting a single trajectory from single-robot SLAM datasets into multiple segments to simulate multiple robots. Without careful consideration of how a single trajectory is split, this approach fails to capture the realistic pose-dependent variation in scene observations inherent to multi-robot systems. To address this gap, we present CU-Multi, a multi-robot dataset collected over multiple days at two locations on the University of Colorado Boulder campus. Using a single robotic platform, we generate four synchronized runs with aligned start times and deliberate percentages of trajectory overlap. CU-Multi includes RGB-D, GPS with accurate geospatial heading, and semantically annotated LiDAR data. By introducing controlled variation in trajectory overlap and providing dense LiDAR annotations, CU-Multi offers a compelling alternative for evaluating methods in multi-robot data association. Instructions on accessing the dataset, support code, and the latest updates are publicly available at https://arpg.github.io/cumulti
Problem

Research questions and friction points this paper is trying to address.

Aligning perception data across multi-robot systems
Evaluating data association without realistic pose variations
Providing a dataset for multi-robot trajectory overlap analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-robot dataset with synchronized runs
Includes RGB-D, GPS, and annotated LiDAR
Controlled trajectory overlap for evaluation
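The "controlled trajectory overlap" idea can be made concrete with a simple proximity metric: the fraction of poses in one run that pass near some pose in another run. This is a minimal illustrative sketch, not the dataset authors' own definition; the function name, the 2D pose representation, and the distance threshold are all assumptions for illustration.

```python
import numpy as np

def trajectory_overlap(traj_a, traj_b, radius=0.5):
    """Fraction of poses in traj_a that come within `radius` meters
    of some pose in traj_b. Trajectories are (N, 2) arrays of x, y.
    Hypothetical metric for illustration, not the paper's definition."""
    # Pairwise Euclidean distances between every pose in A and every pose in B.
    d = np.linalg.norm(traj_a[:, None, :] - traj_b[None, :, :], axis=-1)
    # A pose in A "overlaps" if its nearest neighbor in B is within the radius.
    return float(np.mean(d.min(axis=1) <= radius))

# Toy example: run B retraces the first half of run A's straight-line path.
a = np.stack([np.arange(10.0), np.zeros(10)], axis=1)
b = np.stack([np.arange(5.0), np.zeros(5)], axis=1)
print(trajectory_overlap(a, b))  # 0.5 — half of A's poses lie near B
```

A brute-force distance matrix is fine for short runs; real trajectories with many thousands of poses would call for a KD-tree nearest-neighbor query instead.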
πŸ”Ž Similar Papers
2023-12-04 · IEEE/RSJ International Conference on Intelligent Robots and Systems · Citations: 0