Thermal-LiDAR Fusion for Robust Tunnel Localization in GNSS-Denied and Low-Visibility Conditions

📅 2025-05-06
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In GNSS-denied, low-illumination, texture-poor, and smoke-affected environments—such as tunnels—visual and LiDAR SLAM often fail due to insufficient distinctive features. To address this, this paper proposes a robust tightly coupled thermal-LiDAR SLAM framework. It introduces thermal radiometric contrast as a novel structural cue, synergistically fusing thermal visual odometry with LiDAR-SLAM via an extended Kalman filter for multi-sensor co-optimization. The method overcomes localization failures in repetitive-structure and low-visibility scenarios, achieving centimeter-level accuracy and near-100% continuous localization success in real-world tunnel experiments. Compared to unimodal baselines, its robustness improves by over threefold, significantly outperforming state-of-the-art GNSS-, vision-, and LiDAR-only approaches.

Technology Category

Application Category

📝 Abstract
Despite significant progress in autonomous navigation, a critical gap remains in ensuring reliable localization in hazardous environments such as tunnels, urban disaster zones, and underground structures. Tunnels present a uniquely difficult scenario: they are not only prone to GNSS signal loss, but also provide little features for visual localization due to their repetitive walls and poor lighting. These conditions degrade conventional vision-based and LiDAR-based systems, which rely on distinguishable environmental features. To address this, we propose a novel sensor fusion framework that integrates a thermal camera with a LiDAR to enable robust localization in tunnels and other perceptually degraded environments. The thermal camera provides resilience in low-light or smoke conditions, while the LiDAR delivers precise depth perception and structural awareness. By combining these sensors, our framework ensures continuous and accurate localization across diverse and dynamic environments. We use an Extended Kalman Filter (EKF) to fuse multi-sensor inputs, and leverages visual odometry and SLAM (Simultaneous Localization and Mapping) techniques to process the sensor data, enabling robust motion estimation and mapping even in GNSS-denied environments. This fusion of sensor modalities not only enhances system resilience but also provides a scalable solution for cyber-physical systems in connected and autonomous vehicles (CAVs). To validate the framework, we conduct tests in a tunnel environment, simulating sensor degradation and visibility challenges. The results demonstrate that our method sustains accurate localization where standard approaches deteriorate due to the tunnels featureless geometry. The frameworks versatility makes it a promising solution for autonomous vehicles, inspection robots, and other cyber-physical systems operating in constrained, perceptually poor environments.
Problem

Research questions and friction points this paper is trying to address.

Ensures reliable localization in GNSS-denied, low-visibility tunnels
Integrates thermal-LiDAR fusion to overcome featureless tunnel environments
Enhances autonomous navigation in hazardous, perceptually degraded conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Thermal-LiDAR fusion for robust tunnel localization
EKF-based multi-sensor fusion for motion estimation
Combines thermal resilience with LiDAR precision
🔎 Similar Papers
No similar papers found.
L
Lukas Schichler
Virtual Vehicle Research GmbH, Graz, Austria
K
Karin Festl
Virtual Vehicle Research GmbH, Graz, Austria
S
Selim Solmaz
Virtual Vehicle Research GmbH, Graz, Austria
Daniel Watzenig
Daniel Watzenig
Graz University of Technology