Enhancing Feature Tracking Reliability for Visual Navigation using Real-Time Safety Filter

📅 2025-02-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
In GPS-denied visual navigation, unreliable feature tracking caused by occlusion or rapid motion often degrades pose estimation. To address this, this work introduces an invariant-set formulation of feature-visibility constraints over the robot's kinematic model and designs a real-time safety filter based on quadratic programming (QP) that minimally modifies the reference velocity command while enforcing persistent visibility of a sufficient number of features. The method combines robot kinematics, visual SLAM, and an information-based visibility metric to jointly preserve perceptual robustness and control accuracy. Simulation results verify constraint invariance, while real-world SLAM experiments demonstrate significant improvements in pose estimation accuracy under challenging conditions, including texture-poor environments and high-speed motion, outperforming baseline controllers.

📝 Abstract
Vision sensors are extensively used for localizing a robot's pose, particularly in environments where global localization tools such as GPS or motion capture systems are unavailable. In many visual navigation systems, localization is achieved by detecting and tracking visual features or landmarks, which provide information about the sensor's relative pose. For reliable feature tracking and accurate pose estimation, it is crucial to maintain visibility of a sufficient number of features. This requirement can sometimes conflict with the robot's overall task objective. In this paper, we approach it as a constrained control problem. By leveraging the invariance properties of visibility constraints within the robot's kinematic model, we propose a real-time safety filter based on quadratic programming. This filter takes a reference velocity command as input and produces a modified velocity that minimally deviates from the reference while ensuring the information score from the currently visible features remains above a user-specified threshold. Numerical simulations demonstrate that the proposed safety filter preserves the invariance condition and ensures the visibility of more features than the required minimum. We also validated its real-world performance by integrating it into a visual simultaneous localization and mapping (SLAM) algorithm, where it maintained high estimation quality in challenging environments, outperforming a simple tracking controller.
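The safety filter described in the abstract solves, at each control step, a small QP: minimize the deviation from the reference velocity subject to the visibility (information-score) constraint. As a minimal sketch, the single-constraint case admits a closed-form projection; the constraint vector `a` and threshold `b` below are illustrative stand-ins for the paper's linearized visibility constraint, not its actual formulation.

```python
import numpy as np

def safety_filter(u_ref, a, b):
    """Minimally modify the reference velocity u_ref so that the
    (hypothetical) linearized visibility constraint a @ u >= b holds.

    Solves  min ||u - u_ref||^2  s.t.  a @ u >= b,
    which for a single linear constraint reduces to a projection.
    """
    residual = b - a @ u_ref
    if residual <= 0:
        # Constraint already satisfied: pass the reference through unchanged.
        return u_ref.copy()
    # Otherwise project onto the constraint boundary (the QP's active-set solution).
    return u_ref + (residual / (a @ a)) * a

# Toy example: 2-D velocity command, illustrative constraint u_x + u_y >= 1.
u_ref = np.array([0.0, 0.0])
a = np.array([1.0, 1.0])
u_safe = safety_filter(u_ref, a, 1.0)
```

With several simultaneous visibility constraints, as in the paper, one would hand the same objective and constraint set to a QP solver instead of using this closed form.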
Problem

Research questions and friction points this paper is trying to address.

Visual Navigation
Object Tracking
Robot Localization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Real-time Filtering
Visual Navigation
Object Tracking
Dabin Kim
Department of Aerospace Engineering, Seoul National University, Seoul 08826, South Korea
Inkyu Jang
Seoul National University
Robotics
Youngsoo Han
Department of Aerospace Engineering, Seoul National University, Seoul 08826, South Korea
Sunwoo Hwang
PhD Student, Seoul National University
Aerial Robots · Robotics · Control Theory · Safety-critical Control
H. Jin Kim
Seoul National University, South Korea
Robotics · Drones · Intelligent Control