🤖 AI Summary
This work addresses the challenge of achieving robust visual SLAM with thermal imaging under conditions of low texture, low contrast, and high noise. The authors propose a sparse monocular thermal SLAM system based on factor graph optimization that, for the first time, successfully adapts a general-purpose learned feature extraction and matching pipeline -- SuperPoint combined with LightGlue -- to the thermal domain. Through tailored thermal image preprocessing and module-level adaptation, the method makes these learned features usable on thermal imagery without any fine-tuning on thermal data. Furthermore, keypoint confidence scores are leveraged to construct a weighted factor graph, enabling robust pose estimation. Experiments on public thermal datasets demonstrate that the proposed system achieves strong cross-domain generalization and practical performance.
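The confidence-weighted factor graph idea above can be illustrated with a minimal sketch: each keypoint's reprojection residual is down-weighted by its SuperPoint detection confidence, so uncertain detections pull less on the pose estimate. This is a hypothetical illustration of the general principle (function names and the exact weighting scheme are assumptions, not the paper's implementation):

```python
import numpy as np

def weighted_reprojection_cost(residuals, confidences):
    """Sum of squared reprojection errors, scaled per keypoint.

    Hypothetical sketch: each 2-D residual is weighted by its keypoint's
    confidence score in [0, 1], mimicking a per-factor information
    scaling in a factor graph. The paper's actual weighting may differ.
    """
    r = np.asarray(residuals, dtype=np.float64)    # shape (N, 2), pixel errors
    w = np.asarray(confidences, dtype=np.float64)  # shape (N,), detector scores
    return float(np.sum(w * np.sum(r * r, axis=1)))

# A high-confidence keypoint with 1 px error and a low-confidence one
# with 2 px error: the latter's larger error is partially discounted.
cost = weighted_reprojection_cost([[1.0, 0.0], [0.0, 2.0]], [1.0, 0.5])
# cost = 1.0*1.0 + 0.5*4.0 = 3.0
```

In a full SLAM backend, the same weights would typically enter through each factor's noise model rather than a scalar cost sum.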
📝 Abstract
Thermal imaging provides a practical sensing modality for visual SLAM in visually degraded environments such as low illumination, smoke, or adverse weather. However, thermal imagery often exhibits low texture, low contrast, and high noise, complicating feature-based SLAM. In this work, we propose a sparse monocular graph-based SLAM system for thermal imagery that leverages general-purpose learned features -- the SuperPoint detector and LightGlue matcher -- trained on large-scale visible-spectrum data, to improve cross-domain generalization. To adapt these components to thermal data, we introduce a preprocessing pipeline that enhances input suitability and modify core SLAM modules to handle sparse and outlier-prone feature matches. We further incorporate keypoint confidence scores from SuperPoint into a confidence-weighted factor graph to improve estimation robustness. Evaluations on public thermal datasets demonstrate that the proposed system achieves reliable performance without dataset-specific training or fine-tuning of the feature detector, an important property given the scarcity of high-quality thermal training data. Code will be made available upon publication.
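A common way to make raw thermal frames more amenable to visible-spectrum feature extractors is to rescale their narrow intensity band into the full 8-bit range. The sketch below shows one such step, percentile-clipped normalization of a 16-bit frame; it is an assumed example of this class of preprocessing, not the paper's specific pipeline:

```python
import numpy as np

def preprocess_thermal(raw, lo_pct=1.0, hi_pct=99.0):
    """Rescale a 16-bit thermal frame to 8-bit with percentile clipping.

    Hypothetical sketch: clipping to the [lo_pct, hi_pct] intensity
    percentiles stretches the narrow temperature band most thermal
    scenes occupy, boosting contrast before feature extraction.
    """
    lo, hi = np.percentile(raw, [lo_pct, hi_pct])
    clipped = np.clip(raw.astype(np.float32), lo, hi)
    scaled = (clipped - lo) / max(hi - lo, 1e-6)  # guard against flat frames
    return (scaled * 255.0).astype(np.uint8)

# Synthetic 16-bit frame whose values span only a few hundred counts,
# as is typical for a thermal camera observing a near-uniform scene.
rng = np.random.default_rng(0)
frame = rng.normal(30000, 200, (8, 8)).astype(np.uint16)
out = preprocess_thermal(frame)  # uint8 image using the full 0..255 range
```

Practical pipelines often follow such a rescaling with local contrast enhancement (e.g. CLAHE) and denoising before running the detector.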