GND: Global Navigation Dataset with Multi-Modal Perception and Multi-Category Traversability in Outdoor Campus Environments

📅 2024-09-21
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Mobile robots operating in large-scale outdoor environments face two key challenges: heavy reliance on pre-built high-precision maps and limited commonsense reasoning capabilities. To address these, this paper introduces GND (Global Navigation Dataset), the first multimodal, commonsense-aware navigation dataset for campus-scale outdoor environments. GND spans ten university campuses (totaling 2.7 km²), with synchronized 3D LiDAR, RGB, and 360° imagery, and includes fine-grained traversability annotations across five semantic classes (pedestrian walkways, vehicle roadways, stairs, off-road terrain, and obstacles) aligned via cross-campus georegistration and scale normalization. Crucially, GND is the first dataset to systematically integrate multimodal sensory inputs with multi-class traversability maps, enabling diverse navigation paradigms including map-based navigation, map-free navigation, and global place recognition. Experiments demonstrate substantial improvements in model generalization and robustness across global navigation, map-free navigation, and place recognition tasks, thereby bridging a critical gap in outdoor commonsense reasoning benchmarks.

📝 Abstract
Navigating large-scale outdoor environments requires complex reasoning in terms of geometric structures, environmental semantics, and terrain characteristics, which are typically captured by onboard sensors such as LiDAR and cameras. While current mobile robots can navigate such environments using pre-defined, high-precision maps based on hand-crafted rules catered for the specific environment, they lack commonsense reasoning capabilities that most humans possess when navigating unknown outdoor spaces. To address this gap, we introduce the Global Navigation Dataset (GND), a large-scale dataset that integrates multi-modal sensory data, including 3D LiDAR point clouds and RGB and 360-degree images, as well as multi-category traversability maps (pedestrian walkways, vehicle roadways, stairs, off-road terrain, and obstacles) from ten university campuses. These environments encompass a variety of parks, urban settings, elevation changes, and campus layouts of different scales. The dataset covers approximately 2.7 km² and includes at least 350 buildings in total. We also present a set of novel applications of GND to showcase its utility to enable global robot navigation, such as map-based global navigation, mapless navigation, and global place recognition.
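The abstract names five traversability categories but does not specify how the maps are encoded. A minimal sketch of how such a multi-category map could be consumed downstream, assuming a hypothetical integer-class grid encoding and illustrative per-robot traversability rules (neither is specified by the paper):

```python
import numpy as np

# Hypothetical class IDs for the five categories named in the paper.
CLASSES = {0: "pedestrian walkway", 1: "vehicle roadway", 2: "stairs",
           3: "off-road terrain", 4: "obstacle"}

# Illustrative assumption: a wheeled robot avoids stairs and obstacles,
# while a legged robot can additionally traverse stairs.
TRAVERSABLE = {
    "wheeled": {0, 1, 3},
    "legged": {0, 1, 2, 3},
}

def traversable_mask(grid: np.ndarray, robot: str) -> np.ndarray:
    """Boolean mask of map cells the given robot type may enter."""
    allowed = TRAVERSABLE[robot]
    return np.isin(grid, list(allowed))

# Toy 3x3 map (class IDs per cell).
grid = np.array([[0, 1, 2],
                 [3, 4, 0],
                 [2, 2, 1]])
wheeled_ok = traversable_mask(grid, "wheeled")
legged_ok = traversable_mask(grid, "legged")
```

Such a mask is what a map-based global planner would typically consume: the robot-specific rules turn one shared multi-category map into different costmaps per platform.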
Problem

Research questions and friction points this paper is trying to address.

How to enhance robot navigation in large-scale outdoor environments using multi-modal sensory data.
How to address the lack of commonsense reasoning in robots navigating unknown outdoor spaces.
How to provide a dataset for global navigation with diverse traversability categories.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates multi-modal sensory data (3D LiDAR, RGB, and 360-degree imagery) for navigation
Includes multi-category traversability maps spanning five semantic classes
Enables global robot navigation applications: map-based navigation, mapless navigation, and global place recognition