Panoptic-CUDAL Technical Report: Rural Australia Point Cloud Dataset in Rainy Conditions

📅 2025-03-20
🤖 AI Summary
Existing autonomous driving datasets suffer from a severe scarcity of panoptic segmentation annotations for rural scenes under rainy conditions, leading to significant degradation in perception performance under adverse weather and unstructured environments. To address this, we introduce the first multimodal (LiDAR + RGB + pose) panoptic segmentation point cloud dataset specifically designed for rainy rural roads, comprising over 100,000 high-resolution point cloud frames with fine-grained semantic and instance labels. Data acquisition employs a mechanical LiDAR synchronized with an RGB camera and integrated GNSS-IMU localization. Annotation leverages a semi-automatic pipeline augmented by cross-modal consistency verification. Benchmarking reveals that state-of-the-art panoptic segmentation models exhibit over 35% mAP degradation on this dataset, confirming its substantial challenge. This work fills a critical gap in fine-grained 3D understanding under adverse weather and rural conditions, providing essential support for robust autonomous driving perception.
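The summary mentions a semi-automatic annotation pipeline with cross-modal consistency verification between the synchronized LiDAR and RGB camera. A common building block for such checks is projecting LiDAR points into the image plane using the camera extrinsics and intrinsics. The sketch below is a minimal illustration of that projection, not the paper's actual pipeline; the function name, argument layout, and depth threshold are assumptions.

```python
import numpy as np

def project_lidar_to_image(points, T_cam_lidar, K):
    """Project LiDAR points (N, 3) into the image plane.

    points: Nx3 array of XYZ coordinates in the LiDAR frame.
    T_cam_lidar: 4x4 extrinsic transform from LiDAR frame to camera frame.
    K: 3x3 camera intrinsic matrix.
    Returns pixel coordinates (M, 2) and the indices of the points
    that lie in front of the camera.
    """
    # Homogeneous coordinates, transformed into the camera frame
    pts_h = np.hstack([points, np.ones((points.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera (positive depth)
    in_front = pts_cam[:, 2] > 0.1
    pts_cam = pts_cam[in_front]

    # Pinhole projection: u = fx*x/z + cx, v = fy*y/z + cy
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return uv, np.flatnonzero(in_front)
```

Once points are projected, a consistency check can compare each point's LiDAR label against the image-space label at its pixel, flagging disagreements for human review.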

📝 Abstract
Existing autonomous driving datasets are predominantly oriented towards well-structured urban settings and favorable weather conditions, leaving the complexities of rural environments and adverse weather largely unaddressed. Although some datasets encompass variations in weather and lighting, adverse weather scenarios remain underrepresented. Rainfall can significantly impair sensor functionality, introducing noise and reflections into LiDAR and camera data and reducing the system's capacity for reliable environmental perception and safe navigation. We introduce the Panoptic-CUDAL dataset, a novel dataset purpose-built for panoptic segmentation in rural areas subject to rain. By recording high-resolution LiDAR, camera, and pose data, Panoptic-CUDAL offers a diverse, information-rich dataset in a challenging scenario. We present an analysis of the recorded data and provide baseline results for panoptic and semantic segmentation methods on LiDAR point clouds. The dataset can be found here: https://robotics.sydney.edu.au/our-research/intelligent-transportation-systems/
Problem

Research questions and friction points this paper is trying to address.

Lack of datasets for rural autonomous driving in rain
Rain impairs sensor data, affecting perception and navigation
Need for panoptic segmentation in adverse weather conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

High-resolution LiDAR and camera data
Panoptic segmentation in rural rain
Baseline results for segmentation methods
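The baseline results above are for panoptic segmentation, which is conventionally scored with the panoptic quality (PQ) metric: the sum of IoUs over matched segments divided by |TP| + 0.5·|FP| + 0.5·|FN|. The paper does not show its evaluation code; the following is a minimal sketch of the standard PQ formula under the assumption that segment matching (IoU > 0.5) has already been done.

```python
def panoptic_quality(matched_ious, num_fp, num_fn):
    """Compute PQ from a list of IoU values for matched
    (true-positive) segment pairs, plus counts of unmatched
    predictions (FP) and unmatched ground-truth segments (FN).
    """
    tp = len(matched_ious)
    denom = tp + 0.5 * num_fp + 0.5 * num_fn
    if denom == 0:
        return 0.0
    sq = sum(matched_ious) / tp if tp else 0.0  # segmentation quality
    rq = tp / denom                             # recognition quality
    return sq * rq                              # PQ = SQ * RQ
```

In practice PQ is computed per class and averaged, with separate breakdowns for "thing" (instance) and "stuff" (semantic) classes.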
👥 Authors
Tzu-Yun Tseng
The Australian Centre for Field Robotics (ACFR) at the University of Sydney (NSW, Australia)
Alexey Nekrasov
RWTH Aachen University
Computer Vision
Malcolm Burdorf
RWTH Aachen University (Aachen, Germany)
Bastian Leibe
Professor for Computer Vision, RWTH Aachen University
Computer Vision, Object Recognition, Tracking, Scene Understanding
Julie Stephany Berrio
The Australian Centre for Field Robotics (ACFR) at the University of Sydney (NSW, Australia)
Mao Shan
Australian Centre for Robotics, The University of Sydney, Australia
Robotics, V2X, Perception, C-ITS
Stewart Worrall
ACFR, University of Sydney
Vehicle automation, Vehicle localisation, Situation awareness, Intelligent transportation systems