GPR-OdomNet: Difference and Similarity-Driven Odometry Estimation Network for Ground Penetrating Radar-Based Localization

📅 2025-11-21
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
In challenging environments, ground-penetrating radar (GPR)-based robot localization suffers from low displacement estimation accuracy due to subtle variations in consecutive B-scan images. To address this, we propose a deep neural network that jointly models multi-scale feature differences and similarities. Our method introduces, for the first time, a multi-scale feature contrast mechanism: a custom network extracts hierarchical representations from sequential B-scan images, and both dissimilarity and similarity cues are fused to regress Euclidean displacement distances. Evaluated on the CMU-GPR dataset, our approach achieves a weighted RMSE of 0.449 mβ€”improving upon the current state-of-the-art by 10.2%. It significantly enhances robustness and accuracy for fine-grained displacements, particularly under adverse conditions. This work establishes a new paradigm for all-weather, GPR-based localization.
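The multi-scale feature contrast idea described above can be illustrated with a minimal NumPy sketch. The pooling scales, the specific dissimilarity/similarity cues, and the linear regression head below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_pool(img, k):
    """Average-pool a 2-D array by factor k (crop to a multiple of k)."""
    h, w = (img.shape[0] // k) * k, (img.shape[1] // k) * k
    return img[:h, :w].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def contrast_features(scan_a, scan_b, scales=(1, 2, 4)):
    """At each scale, pair a dissimilarity cue (mean absolute difference)
    with a similarity cue (cosine similarity of the flattened maps)."""
    feats = []
    for k in scales:
        a, b = avg_pool(scan_a, k).ravel(), avg_pool(scan_b, k).ravel()
        diff = float(np.mean(np.abs(a - b)))
        sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
        feats.extend([diff, sim])
    return np.array(feats)

# Two random arrays standing in for consecutive B-scans.
scan_t, scan_t1 = rng.standard_normal((64, 64)), rng.standard_normal((64, 64))
f = contrast_features(scan_t, scan_t1)

# A linear head with placeholder weights maps the fused difference and
# similarity cues to a scalar displacement estimate (in meters).
w, b = rng.standard_normal(f.shape[0]), 0.0
displacement = float(f @ w + b)
```

In the paper, a trained custom network plays the role of both the hierarchical feature extractor and the regression head; the point here is only the structure of the fusion — each scale contributes one "how different" and one "how similar" cue before regression.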

πŸ“ Abstract
When performing robot/vehicle localization using ground penetrating radar (GPR) to handle adverse weather and environmental conditions, existing techniques often struggle to accurately estimate distances when processing B-scan images with minor distinctions. This study introduces a new neural network-based odometry method that leverages the similarity and difference features of GPR B-scan images for precise estimation of the Euclidean distances traveled between the B-scan images. The new custom neural network extracts multi-scale features from B-scan images taken at consecutive moments and then determines the Euclidean distance traveled by analyzing the similarities and differences between these features. To evaluate our method, an ablation study and comparison experiments have been conducted using the publicly available CMU-GPR dataset. The experimental results show that our method consistently outperforms state-of-the-art counterparts in all tests. Specifically, our method achieves the lowest root mean square error (RMSE) on every data set and an overall weighted RMSE of 0.449 m across all data sets, a 10.2% reduction in RMSE compared to the best state-of-the-art method.
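The overall weighted RMSE quoted above aggregates per-dataset errors. A short sketch, assuming the common convention of pooling squared errors weighted by each data set's sample count (the per-dataset values below are made-up toy numbers, not the paper's results):

```python
import math

# Hypothetical per-dataset RMSEs (meters) and sample counts.
rmse = [0.30, 0.55, 0.48]
n = [100, 60, 90]

# Pool the squared errors, weighting each data set by its sample count,
# then take the square root of the pooled mean.
weighted_rmse = math.sqrt(
    sum(k * r**2 for r, k in zip(rmse, n)) / sum(n)
)
```

The pooled value always lies between the smallest and largest per-dataset RMSE, with larger data sets pulling it toward their own error level.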
Problem

Research questions and friction points this paper is trying to address.

Estimating precise robot distances from GPR B-scans with minor distinctions
Leveraging similarity and difference features in GPR images for odometry
Improving localization accuracy in adverse weather using neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural network extracts multi-scale GPR B-scan features
Analyzes feature similarities and differences for distance
Estimates Euclidean distance traveled between consecutive B-scans
Huaichao Wang
Department of Computer Science, Civil Aviation University of China, Tianjin, China
Xuanxin Fan
Department of Computer Science, Civil Aviation University of China, Tianjin, China
Ji Liu
Chengdu Textile College, Chengdu, China
Dezhen Song
Professor at MBZUAI
Robot perception · robot navigation · sensor fusion · networked robots · automation
Haifeng Li
Central South University
GIS · Remote sensing · Machine learning · Sparse representation · Brain theory