🤖 AI Summary
This work addresses the degradation of visual-inertial odometry (VIO) performance in visually sparse environments, where autonomous exploration often leads to localization drift and mapping failure. To mitigate this, the authors propose a hierarchical perception-aware exploration framework that explicitly integrates feature quality assessment and continuous yaw trajectory optimization into the exploration strategy. Specifically, a global feature map is constructed to prioritize frontier candidates according to their expected visual feature quality, while continuous yaw motion is optimized to maintain stable visual feature tracking. Experimental results in both simulated and real-world low-texture environments demonstrate that the proposed method significantly enhances feature tracking stability, reduces odometry drift, and improves exploration coverage efficiency by an average of 30% compared to baseline approaches.
📝 Abstract
Autonomous exploration in unknown environments typically relies on onboard state estimation for localisation and mapping. Existing exploration methods primarily maximise coverage efficiency, but often overlook that visual-inertial odometry (VIO) performance strongly depends on the availability of robust visual features. As a result, exploration policies can drive a robot into feature-sparse regions where tracking degrades, leading to odometry drift, corrupted maps, and mission failure. We propose a hierarchical perception-aware exploration framework for a stereo-equipped unmanned aerial vehicle (UAV) that explicitly couples exploration progress with feature observability. Our approach (i) associates each candidate frontier with an expected feature quality using a global feature map and prioritises visually informative subgoals, and (ii) optimises a continuous yaw trajectory along the planned motion to maintain stable feature tracks. We evaluate our method in simulation across environments with varying texture levels and in real-world indoor experiments with largely textureless walls. Compared to baselines that ignore feature quality and/or do not optimise continuous yaw, our method maintains more reliable feature tracking, reduces odometry drift, and achieves on average 30% higher coverage before the odometry error exceeds specified thresholds.
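To make the first component concrete, the perception-aware subgoal selection described above can be sketched as a weighted score over candidate frontiers, trading expected coverage gain against the feature quality looked up from a global feature map. This is a minimal illustrative sketch under assumed names (`frontier_score`, `select_subgoal`, the `gain`/`quality` fields, and the linear weighting) rather than the paper's actual formulation:

```python
# Hypothetical sketch: perception-aware frontier prioritisation.
# Assumption: each frontier carries an expected coverage gain and an
# expected visual feature quality (both normalised to [0, 1]) queried
# from a global feature map. The linear trade-off is illustrative only.

def frontier_score(coverage_gain: float, feature_quality: float,
                   weight: float = 0.5) -> float:
    """Blend exploration utility with feature observability.

    coverage_gain  : expected newly observed volume, normalised to [0, 1]
    feature_quality: expected trackable-feature quality in [0, 1]
    weight         : trade-off between coverage and perception terms
    """
    return (1.0 - weight) * coverage_gain + weight * feature_quality

def select_subgoal(frontiers: list[dict]) -> dict:
    """Pick the frontier with the highest perception-aware score."""
    return max(frontiers, key=lambda f: frontier_score(f["gain"], f["quality"]))

# Example: a textured frontier with moderate coverage gain outranks a
# feature-poor frontier with higher raw gain, steering the UAV away
# from regions where VIO tracking would degrade.
frontiers = [
    {"name": "blank_wall", "gain": 0.9, "quality": 0.05},
    {"name": "textured_corner", "gain": 0.7, "quality": 0.8},
]
best = select_subgoal(frontiers)
```

A purely coverage-driven baseline corresponds to `weight = 0`, which would pick the blank wall here; the perception term is what redirects exploration toward visually informative regions.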