Self-Supervised Traversability Learning with Online Prototype Adaptation for Off-Road Autonomous Driving

πŸ“… 2025-04-16
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Off-road autonomous driving demands real-time, robust terrain traversability analysis, yet faces critical challenges: the absence of large-scale annotated datasets, severe onboard computational constraints, and stringent real-time requirements (inference at β‰₯10 Hz). To address these, we propose the first self-supervised Bird's-Eye-View (BEV) traversability assessment framework tailored for off-road environments. It pioneers the integration of BEV representation into self-supervised traversability learning; introduces a lightweight network architecture coupled with an online prototype clustering mechanism to enable dynamic, in-motion environment representation updates; and requires no manual annotations, producing traversability cost maps end-to-end. Evaluated on a diverse, real-world dataset spanning multiple seasons and geographic regions, our method significantly outperforms state-of-the-art approaches. Onboard deployment achieves 10 Hz inference throughput. A 5.5 km field experiment demonstrates that the generated cost maps directly enable downstream motion planning without further adaptation.

πŸ“ Abstract
Achieving reliable and safe autonomous driving in off-road environments requires accurate and efficient terrain traversability analysis. However, this task faces several challenges, including the scarcity of large-scale datasets tailored for off-road scenarios, the high cost and potential errors of manual annotation, the stringent real-time requirements of motion planning, and the limited computational power of onboard units. To address these challenges, this paper proposes a novel traversability learning method that leverages self-supervised learning, eliminating the need for manual annotation. For the first time, a Bird's-Eye View (BEV) representation is used as input, reducing computational burden and improving adaptability to downstream motion planning. During vehicle operation, the proposed method conducts online analysis of traversed regions and dynamically updates prototypes to adaptively assess the traversability of the current environment, effectively handling dynamic scene changes. We evaluate our approach against state-of-the-art benchmarks on both public datasets and our own dataset, covering diverse seasons and geographical locations. Experimental results demonstrate that our method significantly outperforms recent approaches. Additionally, real-world vehicle experiments show that our method operates at 10 Hz, meeting real-time requirements, while a 5.5 km autonomous driving experiment further validates the generated traversability cost maps' compatibility with downstream motion planning.
Problem

Research questions and friction points this paper is trying to address.

Self-supervised traversability learning for off-road autonomous driving
Online prototype adaptation to handle dynamic scene changes
BEV representation reduces computation and aids motion planning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-supervised learning eliminates manual annotation
Birds-Eye View input reduces computational burden
Online prototype adaptation handles dynamic scenes
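
The online prototype idea above can be illustrated with a minimal sketch: features from regions the vehicle has just traversed (implicitly labeled traversable by self-supervision) update a prototype vector via an exponential moving average, and each BEV cell is costed by its dissimilarity to that prototype. This is an assumed simplification for illustration only; the function names, the momentum value, and the cosine-distance cost are not taken from the paper.

```python
import numpy as np

def update_prototype(prototype, traversed_features, momentum=0.9):
    """EMA update of a traversable-terrain prototype from features of
    recently traversed regions (hypothetical self-supervised signal)."""
    batch_mean = traversed_features.mean(axis=0)
    proto = momentum * prototype + (1.0 - momentum) * batch_mean
    # Keep the prototype unit-norm so cosine similarity is a dot product.
    return proto / np.linalg.norm(proto)

def traversability_cost(prototype, cell_features):
    """Per-BEV-cell cost: 1 - cosine similarity to the prototype.
    Low cost = similar to terrain the vehicle has already driven on."""
    norms = np.linalg.norm(cell_features, axis=-1, keepdims=True)
    feats = cell_features / norms
    return 1.0 - feats @ prototype
```

In this sketch the prototype adapts continuously as the scene changes, so terrain that resembles what was just driven on stays cheap even across season or region shifts, without any manual labels.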
Yafeng Bu
College of Intelligence Science and Technology, National University of Defense Technology
Zhenping Sun
College of Intelligence Science and Technology, National University of Defense Technology
Xiaohui Li
College of Intelligence Science and Technology, National University of Defense Technology
Jun Zeng
University of California, Berkeley
Xin Zhang
College of Intelligence Science and Technology, National University of Defense Technology
Hui Shen
College of Intelligence Science and Technology, National University of Defense Technology