🤖 AI Summary
This work addresses the instability of handcrafted 3D place recognition descriptors under small LiDAR pose translations by proposing a learning-free bird’s-eye-view (BEV) descriptor. The method models grid cells as Bernoulli random variables and, for the first time, analytically marginalizes continuous translation uncertainty: translation robustness is obtained through a polar-coordinate Jacobian together with a distance-adaptive angular uncertainty model that depends only on physically interpretable translation-uncertainty parameters, enabling cross-sensor generalization. Combined with a Bernoulli–KL Jaccard similarity measure, exponential uncertainty gating, and FFT-accelerated circular-correlation matching, the approach significantly outperforms existing handcrafted descriptors across four multi-session, multi-LiDAR datasets and matches both supervised and unsupervised learning-based methods on single-session tasks.
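The distance-adaptive angular uncertainty model described above follows from the polar Jacobian: a fixed translational uncertainty $\sigma_t$ (in meters) subtends a larger angle on nearby rings than on distant ones, i.e. $\sigma_\theta = \sigma_t / r$. A minimal sketch of this idea, assuming a polar BEV grid and hypothetical function names (not the authors' implementation):

```python
import numpy as np

def angular_uncertainty(ring_radii, sigma_t):
    """Per-ring angular std (radians): sigma_theta = sigma_t / r."""
    return sigma_t / np.asarray(ring_radii, dtype=float)

def blur_ring(occupancy_row, sigma_theta, sector_width):
    """Circularly smooth one polar-BEV ring with a wrapped Gaussian
    whose angular std is sigma_theta (illustrative, not the paper's exact kernel)."""
    n = occupancy_row.size
    offsets = np.arange(n)
    offsets = np.minimum(offsets, n - offsets)  # circular distance in sectors
    kernel = np.exp(-0.5 * (offsets * sector_width / sigma_theta) ** 2)
    kernel /= kernel.sum()
    # Circular convolution via FFT (product of spectra = circular convolution).
    return np.real(np.fft.ifft(np.fft.fft(occupancy_row) * np.fft.fft(kernel)))

# Rings at 2 m, 10 m, 50 m with sigma_t = 0.5 m:
# the closest ring receives the largest angular uncertainty.
sigmas = angular_uncertainty([2.0, 10.0, 50.0], sigma_t=0.5)
```

Because the smoothing is separable per ring, the whole marginalization costs $\mathcal{O}(R \times S)$ (up to the per-ring FFT), matching the complexity stated in the abstract.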
📝 Abstract
We present PROBE (PRobabilistic Occupancy BEV Encoding), a learning-free LiDAR place recognition descriptor that models each BEV cell's occupancy as a Bernoulli random variable. Rather than relying on discrete point-cloud perturbations, PROBE analytically marginalizes over continuous Cartesian translations via the polar Jacobian, yielding a distance-adaptive angular uncertainty $\sigma_\theta = \sigma_t / r$ in $\mathcal{O}(R \times S)$ time. The primary parameter $\sigma_t$ represents the expected translational uncertainty in meters, a sensor-independent physical quantity that enables cross-sensor generalization without per-dataset tuning. Pairwise similarity combines a Bernoulli–KL Jaccard score with exponential uncertainty gating and an FFT-based height cosine similarity for rotation alignment. Evaluated on four datasets spanning four diverse LiDAR types, PROBE achieves the highest accuracy among handcrafted descriptors in multi-session evaluation and competitive single-session performance against both handcrafted and supervised baselines. The source code and supplementary materials are available at https://sites.google.com/view/probe-pr.
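The FFT-based rotation alignment mentioned in the abstract rests on a standard identity: circular cross-correlation of two ring-wise signatures can be computed in $\mathcal{O}(S \log S)$ via the convolution theorem, and the argmax gives the best rotation offset. A minimal sketch under that assumption (generic FFT matching, not the paper's exact height-cosine pipeline):

```python
import numpy as np

def best_rotation(sig_a, sig_b):
    """Return (shift, score): the sector shift of sig_b that best aligns
    it to sig_a, via FFT-accelerated circular cross-correlation."""
    fa = np.fft.fft(sig_a)
    fb = np.fft.fft(sig_b)
    corr = np.real(np.fft.ifft(fa * np.conj(fb)))  # circular cross-correlation
    shift = int(np.argmax(corr))
    return shift, corr[shift]

# Example: a random ring signature and a copy rotated by 17 sectors;
# the correlation peak recovers the rotation.
sectors = 120
sig = np.random.default_rng(0).random(sectors)
rotated = np.roll(sig, 17)
shift, _ = best_rotation(rotated, sig)
```

Once the rotation is resolved this way, the remaining per-cell comparison (e.g. the Bernoulli–KL Jaccard score) can be evaluated at a single alignment instead of all $S$ candidate rotations.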