Raci-Net: Ego-vehicle Odometry Estimation in Adverse Weather Conditions

📅 2025-07-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Visual odometry (VO) suffers significant performance degradation under adverse weather conditions (e.g., rain, snow, low illumination). To address this, we propose a multimodal deep learning odometry framework integrating camera, inertial measurement unit (IMU), and millimeter-wave (mmWave) radar. Uniquely, mmWave radar is elevated to a primary perception modality, and an environment-aware dynamic weight fusion mechanism is introduced to adaptively adjust sensor contributions in real time. During visual degradation, the method explicitly amplifies the roles of radar and IMU, ensuring continuous and robust pose estimation. Extensive experiments on the Boreas dataset demonstrate that our approach achieves high accuracy—reducing average absolute trajectory error (ATE) by 32.7%—and strong robustness across both clear-sky and extreme-weather scenarios, substantially outperforming unimodal baselines and conventional sensor fusion methods.
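The environment-aware dynamic weight fusion described above can be pictured as a small gating network over the per-sensor feature embeddings. The sketch below is an illustrative assumption, not the authors' implementation: the module name, layer sizes, and the softmax gate are hypothetical placeholders for how camera, IMU, and radar features might be re-weighted frame by frame.

```python
# Hypothetical sketch (not the paper's code): a gating network maps the
# concatenated per-sensor embeddings to one weight per sensor, then fuses
# camera, IMU, and radar features by a weighted sum. All names and sizes
# (DynamicWeightFusion, feat_dim=256, the 128-unit MLP) are assumptions.
import torch
import torch.nn as nn


class DynamicWeightFusion(nn.Module):
    def __init__(self, feat_dim: int = 256, num_sensors: int = 3):
        super().__init__()
        # Small MLP producing one logit per sensor; softmax turns the
        # logits into normalized fusion weights.
        self.gate = nn.Sequential(
            nn.Linear(feat_dim * num_sensors, 128),
            nn.ReLU(),
            nn.Linear(128, num_sensors),
        )

    def forward(self, cam_feat, imu_feat, radar_feat):
        # Each input: (batch, feat_dim) embedding from its sensor encoder.
        feats = torch.stack([cam_feat, imu_feat, radar_feat], dim=1)   # (B, 3, D)
        weights = torch.softmax(self.gate(feats.flatten(1)), dim=-1)   # (B, 3)
        # Weighted sum: under visual degradation the gate can shift
        # weight away from the camera toward radar and IMU.
        fused = (weights.unsqueeze(-1) * feats).sum(dim=1)             # (B, D)
        return fused, weights


# Example: fuse dummy embeddings for a batch of 4 frames.
if __name__ == "__main__":
    fusion = DynamicWeightFusion(feat_dim=256)
    cam, imu, radar = (torch.randn(4, 256) for _ in range(3))
    fused, w = fusion(cam, imu, radar)
    print(fused.shape, w.shape)  # torch.Size([4, 256]) torch.Size([4, 3])
```

In such a design, a frame with heavy rain or glare would push the gate's weights away from the camera embedding and toward the radar and IMU embeddings, which matches the behavior the summary describes.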

📝 Abstract
Autonomous driving systems are highly dependent on sensors such as cameras, LiDAR, and inertial measurement units (IMUs) to perceive the environment and estimate their motion. Among these, perception-based sensors are not protected from harsh weather and technical failures. Although existing methods show robustness against common technical issues such as rotational misalignment and disconnection, they often degrade when faced with dynamic environmental factors such as weather conditions. To address these problems, this research introduces a novel deep learning-based motion estimator that integrates visual, inertial, and millimeter-wave radar data, utilizing each sensor's strengths to improve odometry estimation accuracy and reliability under adverse environmental conditions such as snow, rain, and varying illumination. The proposed model uses advanced sensor fusion techniques that dynamically adjust the contributions of each sensor based on the current environmental conditions, with radar compensating for visual sensor limitations in poor visibility. This work explores recent advancements in radar-based odometry and highlights that radar's robustness across weather conditions makes it a valuable component of pose estimation systems, specifically when visual sensors are degraded. Experimental results on the Boreas dataset showcase the robustness and effectiveness of the model in both clear and degraded environments.
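The headline metric reported in the AI summary above is absolute trajectory error (ATE). As a point of reference, the snippet below sketches how ATE is commonly computed for odometry evaluation: rigidly align the estimated trajectory to ground truth, then take the RMSE of the translational residuals. It is a generic illustration; the function name and details are assumptions, not taken from the paper.

```python
# Hypothetical sketch (assumption, not from the paper): ATE as commonly
# computed for odometry evaluation -- align estimated positions to ground
# truth with a rigid (rotation-only) fit, then take the RMSE of the
# per-pose translational errors.
import numpy as np


def ate_rmse(est_xyz: np.ndarray, gt_xyz: np.ndarray) -> float:
    """est_xyz, gt_xyz: (N, 3) positions at matched timestamps."""
    # Center both trajectories.
    est_c = est_xyz - est_xyz.mean(axis=0)
    gt_c = gt_xyz - gt_xyz.mean(axis=0)
    # Rigid alignment (no scale) via SVD, with a reflection correction.
    u, _, vt = np.linalg.svd(gt_c.T @ est_c)
    d = np.sign(np.linalg.det(u @ vt))
    rot = u @ np.diag([1.0, 1.0, d]) @ vt
    aligned = (rot @ est_c.T).T + gt_xyz.mean(axis=0)
    # RMSE of the per-pose translational error.
    err = np.linalg.norm(aligned - gt_xyz, axis=1)
    return float(np.sqrt((err ** 2).mean()))


# Example with a toy trajectory and a noisy estimate of it.
if __name__ == "__main__":
    gt = np.cumsum(np.random.randn(100, 3) * 0.1, axis=0)
    est = gt + np.random.randn(100, 3) * 0.05
    print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")
```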
Problem

Research questions and friction points this paper is trying to address.

Improves ego-vehicle odometry accuracy in adverse weather
Integrates visual, inertial, and radar data for robust estimation
Addresses sensor degradation in snow, rain, and low visibility
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep learning integrates visual, inertial, and radar data
Dynamic sensor fusion adjusts contributions based on conditions
Radar compensates for visual limitations in poor visibility
🔎 Similar Papers
No similar papers found.
Mohammadhossein Talebi
Politecnico di Milano, 20156, Italy
Pragyan Dahal
Michigan State University, East Lansing, MI, USA
Davide Possenti
Politecnico di Milano, 20156, Italy
Stefano Arrigoni
Politecnico di Milano, Mechanical Engineering
Francesco Braghin
Politecnico di Milano, 20156, Italy