Perception-Aware Autonomous Exploration in Feature-Limited Environments

📅 2026-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the degradation of visual-inertial odometry (VIO) performance in visually sparse environments, where autonomous exploration often leads to localization drift and mapping failure. To mitigate this, the authors propose a hierarchical perception-aware exploration framework that explicitly integrates feature quality assessment and continuous yaw trajectory optimization into the exploration strategy. Specifically, a global feature map is constructed to prioritize frontier candidates based on subgoal desirability, while continuous yaw motion is optimized to maintain stable visual feature tracking. Experimental results in both simulated and real-world low-texture environments demonstrate that the proposed method significantly enhances feature tracking stability, reduces odometry drift, and improves exploration coverage efficiency by an average of 30% compared to baseline approaches.

📝 Abstract
Autonomous exploration in unknown environments typically relies on onboard state estimation for localisation and mapping. Existing exploration methods primarily maximise coverage efficiency, but often overlook that visual-inertial odometry (VIO) performance strongly depends on the availability of robust visual features. As a result, exploration policies can drive a robot into feature-sparse regions where tracking degrades, leading to odometry drift, corrupted maps, and mission failure. We propose a hierarchical perception-aware exploration framework for a stereo-equipped unmanned aerial vehicle (UAV) that explicitly couples exploration progress with feature observability. Our approach (i) associates each candidate frontier with an expected feature quality using a global feature map, and prioritises visually informative subgoals, and (ii) optimises a continuous yaw trajectory along the planned motion to maintain stable feature tracks. We evaluate our method in simulation across environments with varying texture levels and in real-world indoor experiments with largely textureless walls. Compared to baselines that ignore feature quality and/or do not optimise continuous yaw, our method maintains more reliable feature tracking, reduces odometry drift, and achieves on average 30% higher coverage before the odometry error exceeds specified thresholds.
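The frontier-prioritisation step described in the abstract can be sketched as a weighted scoring rule that trades off information gain, expected feature quality from a global feature map, and travel cost. The weights, dictionary fields, and feature-map representation below are illustrative assumptions for a minimal sketch, not the paper's actual formulation.

```python
import math

def score_frontier(frontier, feature_map, robot_pos,
                   w_info=1.0, w_feat=2.0, w_dist=0.5):
    """Perception-aware score for one candidate frontier.

    frontier:    dict with "pos" (x, y), "cell" (map index), "info_gain"
    feature_map: dict mapping map cells to expected feature quality
                 (e.g. expected number of trackable visual features)
    All field names and weights are hypothetical.
    """
    dist = math.dist(robot_pos, frontier["pos"])            # travel cost
    feat_quality = feature_map.get(frontier["cell"], 0.0)   # expected features near frontier
    return (w_info * frontier["info_gain"]
            + w_feat * feat_quality
            - w_dist * dist)

def select_subgoal(frontiers, feature_map, robot_pos):
    """Pick the frontier with the highest perception-aware score."""
    return max(frontiers, key=lambda f: score_frontier(f, feature_map, robot_pos))
```

With weights like these, a frontier adjacent to textured surfaces can outrank a slightly higher-gain frontier facing a blank wall, which is the qualitative behaviour the abstract describes.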
Problem

Research questions and friction points this paper is trying to address.

autonomous exploration
visual-inertial odometry
feature-limited environments
perception-aware
odometry drift
Innovation

Methods, ideas, or system contributions that make the work stand out.

perception-aware exploration
visual-inertial odometry
feature observability
autonomous UAV
frontier-based exploration
Moji Shi
Delft University of Technology, Shanghai Jiao Tong University
motion planning, reinforcement learning, finite element analysis
Rajitha de Silva
Lincoln Center for Autonomous Systems (L-CAS), School of Engineering and Physical Sciences, University of Lincoln, UK
Hang Yu
MAVLab, TU Delft, Netherlands
Riccardo Polvara
Lincoln Center for Autonomous Systems (L-CAS), School of Engineering and Physical Sciences, University of Lincoln, UK
Marija Popović
Assistant Professor, TU Delft
Artificial Intelligence, Robotics, Active Sensing, Robot Learning, Perception