🤖 AI Summary
Particle filters often suffer from premature convergence and struggle to balance diversity with estimation compactness in multimodal environments. To address this, we propose a linear-time diversity maintenance method grounded in the topological structure of the ancestor tree. By implicitly measuring particle similarity through the ancestor tree, our approach enables clustering without requiring spatial or domain-specific priors. We further introduce intra-cluster fitness sharing and non-cluster particle preservation to suppress degeneracy while retaining multimodal structure. Compared to baseline methods, including deterministic resampling and particle-based Gaussian mixture models, our method achieves significantly higher success rates in both synthetic simulations and real-world indoor robot localization tasks, with negligible degradation in estimation compactness. Experimental results demonstrate robustness and generalization across diverse multimodal scenarios.
📝 Abstract
We propose a method for linear-time diversity maintenance in particle filtering. It clusters particles based on ancestry tree topology: closely related particles in sufficiently large subtrees are grouped together. The main idea is that the tree structure implicitly encodes similarity, without the need for spatial or other domain-specific metrics. Combined with intra-cluster fitness sharing and the protection of particles not included in any cluster, this approach effectively prevents premature convergence in multimodal environments while maintaining estimate compactness. We validate our approach in a multimodal robotics simulation and a real-world multimodal indoor environment, comparing its performance against several diversity-maintenance algorithms from the literature, including Deterministic Resampling and Particle Gaussian Mixtures. Our algorithm achieves high success rates with little to no negative effect on compactness, and is particularly robust across different domains and under challenging initial conditions.
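To make the core idea concrete, here is a minimal sketch of ancestry-based clustering with intra-cluster fitness sharing. This is an illustration of the general technique, not the authors' exact algorithm: the representation of the ancestor tree as per-generation parent-index lists, the fixed clustering depth, and the `min_cluster` threshold are all assumptions made for this example.

```python
from collections import defaultdict

def cluster_by_ancestry(parents, depth):
    """Group current particles by their ancestor `depth` generations back.

    parents: list of lists; parents[g][i] is the index of particle i's
    parent at resampling step g (most recent step last). Particles that
    share an ancestor at the given depth belong to the same subtree and
    are treated as similar, with no spatial metric required.
    Returns a dict mapping ancestor index -> list of particle indices.
    """
    n = len(parents[-1])
    groups = defaultdict(list)
    for i in range(n):
        a = i
        # Walk up the ancestry tree `depth` resampling steps.
        for g in range(len(parents) - 1, len(parents) - 1 - depth, -1):
            a = parents[g][a]
        groups[a].append(i)
    return dict(groups)

def share_fitness(weights, groups, min_cluster):
    """Intra-cluster fitness sharing (one plausible form): divide each
    particle's weight by the size of its cluster, so a large subtree
    cannot dominate resampling. Particles outside any sufficiently
    large cluster keep their raw weight (non-cluster preservation)."""
    shared = list(weights)
    for members in groups.values():
        if len(members) >= min_cluster:
            for i in members:
                shared[i] = weights[i] / len(members)
    total = sum(shared)
    return [w / total for w in shared]
```

For example, with two resampling steps recorded as `parents = [[0, 0, 1, 2], [0, 1, 1, 3]]` and `depth=2`, particles 0-2 trace back to ancestor 0 while particle 3 is a singleton; after sharing with `min_cluster=2`, the singleton's relative weight rises, which is exactly the effect that protects minority modes from being resampled away.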