USVTrack: USV-Based 4D Radar-Camera Tracking Dataset for Autonomous Driving in Inland Waterways

📅 2025-06-23
🤖 AI Summary
To address the lack of dedicated multimodal tracking datasets and fusion methods for inland autonomous navigation, this paper introduces USVTrack, the first 4D radar-monocular camera multimodal tracking dataset specifically designed for unmanned surface vehicles (USVs), encompassing diverse water conditions, illumination levels, and weather scenarios. The authors propose RCM (Radar-Camera Matching), a plug-and-play cross-modal matching method that achieves spatiotemporal alignment and trajectory association between radar and image modalities without requiring precise sensor calibration or prior motion models. Leveraging tightly integrated 4D radar, camera, GPS, and IMU data, RCM improves tracking accuracy and robustness in challenging aquatic environments. USVTrack is publicly released, establishing a benchmark dataset and a reproducible multimodal tracking framework for autonomous driving in waterborne transportation.


📝 Abstract
Object tracking in inland waterways plays a crucial role in safe and cost-effective applications, including waterborne transportation, sightseeing tours, environmental monitoring, and surface rescue. Our Unmanned Surface Vehicle (USV), equipped with a 4D radar, a monocular camera, a GPS, and an IMU, delivers robust tracking capabilities in complex waterborne environments. By leveraging these sensors, our USV collected comprehensive object tracking data, which we present as USVTrack, the first 4D radar-camera tracking dataset tailored for autonomous driving in new-generation waterborne transportation systems. Our USVTrack dataset presents rich scenarios, featuring diverse waterways, varying times of day, and multiple weather and lighting conditions. Moreover, we present a simple but effective radar-camera matching method, termed RCM, which can be plugged into popular two-stage association trackers. Experimental results utilizing RCM demonstrate the effectiveness of radar-camera matching in improving object tracking accuracy and reliability for autonomous driving in waterborne environments. The USVTrack dataset is public on https://usvtrack.github.io.
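The abstract notes that RCM can be plugged into popular two-stage association trackers. As a rough illustration of that idea only (not the paper's actual implementation), the sketch below runs a standard image-plane IoU association first and then uses radar range agreement to rescue unmatched track-detection pairs; all function names, dictionary keys, and thresholds here are assumptions.

```python
def iou(a, b):
    """Axis-aligned IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def associate_two_stage(tracks, detections, iou_thresh=0.5, range_thresh=2.0):
    """Greedy two-stage association with a radar-based second stage.

    Stage 1 matches tracks to detections by image-plane IoU; stage 2
    rescues leftovers whose radar range readings agree, standing in
    for a radar-camera matching cue in the spirit of RCM.

    Each track/detection is a dict: {"box": (x1, y1, x2, y2), "range": metres}.
    Returns a list of (track_index, detection_index) pairs.
    """
    matches, used_t, used_d = [], set(), set()

    # Stage 1: image-only IoU matching, best-scoring pairs first.
    candidates = sorted(
        ((iou(t["box"], d["box"]), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True)
    for score, ti, di in candidates:
        if score < iou_thresh or ti in used_t or di in used_d:
            continue
        matches.append((ti, di))
        used_t.add(ti)
        used_d.add(di)

    # Stage 2: radar rescue -- pair remaining tracks and detections
    # whose measured ranges agree within range_thresh metres.
    for ti, t in enumerate(tracks):
        if ti in used_t:
            continue
        best = min(
            ((abs(t["range"] - d["range"]), di)
             for di, d in enumerate(detections) if di not in used_d),
            default=None)
        if best is not None and best[0] <= range_thresh:
            matches.append((ti, best[1]))
            used_t.add(ti)
            used_d.add(best[1])
    return matches
```

In a real tracker the second-stage cue would come from radar points projected into the image and associated with each detection; a greedy assignment is used here only to keep the sketch dependency-free.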
Problem

Research questions and friction points this paper is trying to address.

Develops a 4D radar-camera dataset for inland waterway tracking
Proposes radar-camera matching to enhance tracking accuracy
Addresses autonomous driving challenges in diverse waterborne environments

Innovation

Methods, ideas, or system contributions that make the work stand out.

USV with 4D radar and camera for tracking
First 4D radar-camera dataset for waterways
Radar-camera matching method boosts tracking accuracy
Shanliang Yao
Yancheng Institute of Technology
Autonomous Driving, Intelligent Vehicles, Radar-Camera Fusion, Maritime Perception
Runwei Guan
Hong Kong University of Science and Technology (Guangzhou) / Founder of FertiTech AI
Multi-Modal Learning, Unmanned Surface Vessel, Radar Perception, AI Medicine
Yi Ni
School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou 215123, China
Sen Xu
School of Information Engineering, Yancheng Institute of Technology, Yancheng 224051, China
Yong Yue
Xi'an Jiaotong-Liverpool University
CAD/CAM, computer graphics, robotics and AI applications
Xiaohui Zhu
Xi'an Jiaotong-Liverpool University
Autonomous navigation, Robotics, AI applications, Environment monitoring, AIoT
Ryan Wen Liu
School of Navigation, Wuhan University of Technology, Wuhan 430063, China, and also with the State Key Laboratory of Maritime Technology and Safety, Wuhan 430063, China