TruckDrive: Long-Range Autonomous Highway Driving Dataset

πŸ“… 2026-03-02
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This study addresses the long-range perception challenges faced by autonomous heavy-duty trucks on highways, where sensing beyond 100 meters is critical yet underexplored: existing datasets predominantly focus on short-range urban scenarios. To bridge this gap, the authors present the first multimodal, long-range autonomous driving dataset tailored to highway environments and heavy-duty trucks. The platform integrates seven long-range FMCW LiDARs, three high-resolution short-range LiDARs, eleven 8MP multi-focal-length cameras, and ten 4D millimeter-wave radars, achieving a perception range of up to 1,000 meters. It provides 475,000 samples, including 165,000 densely annotated frames, supporting tasks such as 2D/3D object detection, depth estimation, tracking, motion planning, and end-to-end driving. Benchmarking experiments reveal that state-of-the-art models suffer performance drops of 31%–99% in 3D perception beyond 150 meters, underscoring the need for dedicated long-range benchmarks and positioning the dataset as a foundational reference for this domain.

πŸ“ Abstract
Safe highway autonomy for heavy trucks remains an open challenge: long braking distances mean that scene understanding over hundreds of meters is required for anticipatory planning and safe braking margins. However, existing driving datasets primarily cover urban scenes, with perception effectively limited to short ranges of up to 100 meters. To address this gap, we introduce TruckDrive, a highway-scale multimodal driving dataset captured with a sensor suite purpose-built for long-range sensing: seven long-range FMCW LiDARs measuring range and radial velocity, three high-resolution short-range LiDARs, eleven 8MP surround cameras with varying focal lengths, and ten 4D FMCW radars. The dataset offers 475,000 samples with 165,000 densely annotated frames, supporting driving-perception benchmarking up to 1,000 meters for 2D detection and up to 400 meters for 3D detection, depth estimation, tracking, planning, and end-to-end driving over 20-second sequences at highway speeds. We find that state-of-the-art autonomous driving models do not generalize to ranges beyond 150 meters, with drops of 31%–99% on 3D perception tasks, exposing a systematic long-range gap that current architectures and training signals cannot close.
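The abstract's premise that trucks need perception over hundreds of meters follows from basic stopping-distance kinematics. A minimal sketch below illustrates this; the speed, reaction time, and deceleration values are illustrative assumptions, not figures from the paper:

```python
# Illustrative stopping-distance estimate for a loaded heavy truck.
# All parameter values are assumptions for illustration only.

def stopping_distance(speed_mps: float,
                      reaction_s: float = 1.5,
                      decel_mps2: float = 2.5) -> float:
    """Reaction distance plus braking distance d = v^2 / (2a)."""
    reaction = speed_mps * reaction_s          # distance covered before braking
    braking = speed_mps ** 2 / (2 * decel_mps2)  # constant-deceleration braking
    return reaction + braking

# 25 m/s (90 km/h) is a common highway speed limit for heavy trucks
# in many regions; at an assumed 2.5 m/s^2 deceleration this already
# exceeds the ~100 m effective range of most existing datasets.
print(f"{stopping_distance(25.0):.1f} m")
```

Even under these mild assumptions the stopping distance lands well past 150 meters, which is consistent with the paper's argument that short-range urban benchmarks are insufficient for highway trucking.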
Problem

Research questions and friction points this paper is trying to address.

long-range perception
autonomous driving
heavy trucks
highway autonomy
driving dataset
Innovation

Methods, ideas, or system contributions that make the work stand out.

long-range perception
FMCW LiDAR
autonomous trucking
highway driving dataset
multimodal sensor suite
πŸ”Ž Similar Papers
No similar papers found.