🤖 AI Summary
To address the lack of dedicated multimodal tracking datasets and fusion methods for inland autonomous navigation, this paper introduces USVTrack—the first 4D radar–monocular camera multimodal tracking dataset specifically designed for unmanned surface vehicles (USVs), encompassing diverse water conditions, illumination levels, and weather scenarios. The authors propose RCM (Radar–Camera Matching), a plug-and-play cross-modal matching method that achieves spatiotemporal alignment and trajectory association between radar and image modalities without requiring precise sensor calibration or prior motion models. Leveraging tightly integrated 4D radar, camera, GPS, and IMU data, RCM improves tracking accuracy and robustness in challenging aquatic environments. USVTrack is publicly released, providing a benchmark dataset and a reproducible multimodal tracking framework for waterborne autonomous driving.
📝 Abstract
Object tracking in inland waterways plays a crucial role in safe and cost-effective waterborne applications, including waterborne transportation, sightseeing tours, environmental monitoring, and surface rescue. Our Unmanned Surface Vehicle (USV), equipped with a 4D radar, a monocular camera, a GPS, and an IMU, delivers robust tracking capabilities in complex waterborne environments. By leveraging these sensors, our USV collected comprehensive object tracking data, which we present as USVTrack, the first 4D radar-camera tracking dataset tailored for autonomous driving in new-generation waterborne transportation systems. Our USVTrack dataset presents rich scenarios, featuring diverse waterways, varying times of day, and multiple weather and lighting conditions. Moreover, we present a simple but effective radar-camera matching method, termed RCM, which can be plugged into popular two-stage association trackers. Experimental results utilizing RCM demonstrate the effectiveness of radar-camera matching in improving object tracking accuracy and reliability for autonomous driving in waterborne environments. The USVTrack dataset is publicly available at https://usvtrack.github.io.
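The abstract describes RCM as a radar-camera matching step that can be plugged into a two-stage association tracker, but does not spell out its internals. Purely as an illustrative sketch of what such a cross-modal association stage might look like (not the paper's actual RCM algorithm), the snippet below greedily matches radar detections, already projected to pixel coordinates, against camera bounding boxes by center distance; the function name, the `max_dist` threshold, and the greedy strategy are all assumptions for illustration.

```python
import numpy as np

def match_radar_to_boxes(radar_uv, boxes, max_dist=50.0):
    """Greedy nearest-neighbor association between projected radar
    detections and camera bounding boxes (hypothetical RCM-like step).

    radar_uv: (N, 2) array of radar detections projected to pixel coords.
    boxes:    (M, 4) array of camera boxes in (x1, y1, x2, y2) format.
    Returns a list of (radar_idx, box_idx) matched pairs.
    """
    centers = (boxes[:, :2] + boxes[:, 2:]) / 2.0  # (M, 2) box centers
    # Cost matrix: Euclidean pixel distance between each radar point
    # and each box center.
    cost = np.linalg.norm(radar_uv[:, None, :] - centers[None, :, :], axis=2)
    pairs, used = [], set()
    # Match radar points in order of their best (smallest) distance first.
    for r in np.argsort(cost.min(axis=1)):
        b = int(np.argmin(cost[r]))
        if b not in used and cost[r, b] <= max_dist:
            used.add(b)
            pairs.append((int(r), b))
    return pairs
```

In a two-stage tracker, the matched pairs could then augment image-based detections with radar range and velocity before the usual track-detection association step; the real RCM additionally handles spatiotemporal alignment across modalities.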