MOANA: Multi-Radar Dataset for Maritime Odometry and Autonomous Navigation Application

📅 2024-12-05
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address challenges in maritime autonomous navigation, including adverse weather, platform motion disturbances, imbalanced near- and far-field perception, and difficulty in detecting dynamic targets, this paper introduces MOANA, the first multi-radar fusion dataset tailored to maritime scenarios. It integrates short-range LiDAR, medium-range W-band radar, and long-range X-band radar, covering diverse sea states and berthing complexities. The authors present the first cross-frequency, cross-range calibration framework for X/W-band radars and LiDAR, delivering temporally and spatially synchronized, ground-truth-annotated multi-sensor sequences. A joint annotation pipeline for radar point clouds and stereo vision, along with a multi-scale segmentation framework for marine dynamic objects, is proposed; seven real-world sequences are released. Experiments demonstrate a 32% improvement in near-field obstacle detection accuracy and a 27% reduction in odometry error, advancing downstream tasks including SLAM, maritime target detection, and robust localization.

📝 Abstract
Maritime environmental sensing requires overcoming challenges from complex conditions such as harsh weather, platform perturbations, large dynamic objects, and the requirement for long detection ranges. While cameras and LiDAR are commonly used in ground vehicle navigation, their applicability in maritime settings is limited by range constraints and hardware maintenance issues. Radar sensors, however, offer robust long-range detection capabilities and resilience to physical contamination from weather and saline conditions, making them powerful sensors for maritime navigation. Among various radar types, X-band radar is widely employed for maritime vessel navigation, providing effective long-range detection essential for situational awareness and collision avoidance. Nevertheless, it exhibits limitations during berthing operations, where near-field detection is critical. To address this shortcoming, we incorporate W-band radar, which excels in detecting nearby objects with a higher update rate. We present a comprehensive maritime sensor dataset featuring multi-range detection capabilities. This dataset integrates short-range LiDAR data, medium-range W-band radar data, and long-range X-band radar data into a unified framework. Additionally, it includes object labels for oceanic object detection, derived from radar and stereo camera images. The dataset comprises seven sequences collected from diverse regions with varying levels of navigation algorithm estimation difficulty, ranging from easy to challenging, and includes common locations suitable for global localization tasks. This dataset serves as a valuable resource for advancing research in place recognition, odometry estimation, SLAM, object detection, and dynamic object elimination within maritime environments. The dataset can be found at https://sites.google.com/view/rpmmoana.
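The abstract describes fusing three sensor streams with very different update rates (LiDAR, W-band radar, X-band radar) into temporally synchronized frames. The page does not specify the dataset's on-disk format or API, so the sketch below is purely illustrative: the `Scan` container and the nearest-timestamp pairing strategy are assumptions, not MOANA's actual tooling.

```python
from bisect import bisect_left
from dataclasses import dataclass, field


@dataclass
class Scan:
    """Hypothetical single sensor sweep: a timestamp plus a point cloud."""
    stamp: float                      # seconds
    points: list = field(default_factory=list)  # (x, y, z) tuples


def nearest(scans, t):
    """Return the scan whose timestamp is closest to t (scans sorted by stamp)."""
    stamps = [s.stamp for s in scans]
    i = bisect_left(stamps, t)
    candidates = scans[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.stamp - t))


def synchronize(lidar, wband, xband, tol=0.1):
    """Pair each LiDAR scan with the temporally nearest W-band and X-band
    scans, dropping any pairing farther than `tol` seconds apart."""
    frames = []
    for l in lidar:
        w = nearest(wband, l.stamp)
        x = nearest(xband, l.stamp)
        if abs(w.stamp - l.stamp) <= tol and abs(x.stamp - l.stamp) <= tol:
            frames.append((l, w, x))
    return frames
```

Because the X-band radar updates far more slowly than the LiDAR, a tight tolerance keeps only the LiDAR sweeps that land near an X-band revolution; a real pipeline would more likely interpolate platform motion between radar sweeps rather than discard frames.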
Problem

Research questions and friction points this paper is trying to address.

Overcoming maritime sensing challenges in harsh conditions
Integrating multi-range radar data for navigation
Providing a dataset for maritime odometry and SLAM
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines X-band and W-band radars for maritime navigation
Integrates LiDAR, W-band, and X-band radar data
Provides labeled dataset for oceanic object detection
Hyesu Jang
Seoul National University
SLAM, Robotics, Radar, Marine, Underwater
Wooseong Yang
Dept. of Mechanical Engineering, SNU, Seoul, S. Korea
Hanguen Kim
Seadronix, Seoul, S. Korea
Dongje Lee
Seadronix, Seoul, S. Korea
Yongjin Kim
Seadronix, Seoul, S. Korea
Jinbum Park
Seadronix, Seoul, S. Korea
Minsoo Jeon
Seadronix, Seoul, S. Korea
Jaeseong Koh
Seadronix, Seoul, S. Korea
Yejin Kang
Seadronix, Seoul, S. Korea
Minwoo Jung
Seoul National University
SLAM, Place Recognition, LiDAR, Cross-modal Sensors
Sangwoo Jung
Dept. of Mechanical Engineering, SNU, Seoul, S. Korea
Chng Zhen Hao
Defence Science and Technology Agency, Singapore
Wong Yu Hin
Defence Science and Technology Agency, Singapore
Chew Yihang
Defence Science and Technology Agency, Singapore
Ayoung Kim
Seoul National University
SLAM, Underwater Robot, navigation, mapping, computer vision