Correlation-Aware Dual-View Pose and Velocity Estimation for Dynamic Robotic Manipulation

📅 2025-10-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Accurate and robust joint estimation of target pose and velocity remains challenging in dynamic robotic manipulation under dual-camera configurations (eye-in-hand and eye-to-hand), particularly under occlusion and rapid motion. Method: This paper proposes a decentralized dual-view fusion estimation algorithm. We formulate a correlation-aware dual-filter framework on the Lie group manifold SE(3) × ℝ³ × ℝ³, integrating a stochastic acceleration motion model with an adaptive extended Kalman filter to achieve tightly coupled estimation of pose, linear velocity, and angular velocity. The approach avoids linearization errors inherent in Euclidean-space formulations and explicitly models inter-state correlations. Contribution/Results: Experiments on the UFactory xArm 850 platform demonstrate that our method reduces pose estimation error by 32.7% and velocity estimation error by 28.4% compared to state-of-the-art methods, while maintaining robustness under severe occlusion and high-speed target motion.
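The summary's "stochastic acceleration motion model" treats the target's acceleration as white noise, so velocity performs a random walk. The paper's filter runs on SE(3) × ℝ³ × ℝ³; as a minimal illustration of just the motion-model structure, here is a hedged single-axis Euclidean sketch of the Kalman prediction step (the function name, state layout, and noise value are illustrative assumptions, not the paper's code):

```python
import numpy as np

def predict(x, P, dt, sigma_a):
    """Propagate state [position, velocity] under white-noise acceleration.

    x: (2,) state, P: (2,2) covariance, sigma_a: acceleration noise std.
    """
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])            # constant-velocity transition
    # Process noise induced by integrating white acceleration over dt
    Q = sigma_a**2 * np.array([[dt**4 / 4, dt**3 / 2],
                               [dt**3 / 2, dt**2]])
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q              # uncertainty grows each step
    return x_pred, P_pred

x0 = np.array([0.0, 1.0])                 # at origin, moving at 1 m/s
P0 = np.eye(2) * 0.01
x1, P1 = predict(x0, P0, dt=0.1, sigma_a=0.5)
print(x1)  # position advances by v*dt; velocity mean is unchanged
```

In the paper this prediction is carried out on the manifold rather than in Euclidean coordinates, which is what avoids the linearization errors the summary mentions.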

📝 Abstract
Accurate pose and velocity estimation is essential for effective spatial task planning in robotic manipulators. While centralized sensor fusion has traditionally been used to improve pose estimation accuracy, this paper presents a novel decentralized fusion approach that estimates both pose and velocity. We use dual-view measurements from an eye-in-hand and an eye-to-hand vision sensor configuration mounted on a manipulator to track a target object whose motion is modeled as a random walk (stochastic acceleration model). The robot runs two independent adaptive extended Kalman filters formulated on a matrix Lie group, developed as part of this work. These filters predict poses and velocities on the manifold $\mathbb{SE}(3) \times \mathbb{R}^3 \times \mathbb{R}^3$ and update the state on the manifold $\mathbb{SE}(3)$. The final fused state, comprising the fused pose and velocities of the target, is obtained using a correlation-aware fusion rule on Lie groups. The proposed method is evaluated on a UFactory xArm 850 equipped with Intel RealSense cameras, tracking a moving target. Experimental results validate the effectiveness and robustness of the proposed decentralized dual-view estimation framework, showing consistent improvements over state-of-the-art methods.
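The abstract's "correlation-aware fusion rule" combines the two filters' estimates without assuming their errors are independent (both filters track the same target, so their errors are correlated). The paper's exact rule on Lie groups is not reproduced here; covariance intersection is a standard correlation-aware fusion rule, sketched below in a hedged Euclidean form with illustrative numbers:

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, w):
    """Fuse two estimates whose cross-correlation is unknown.

    w in [0, 1] weights the two information matrices; in practice w is
    chosen to minimize the trace or determinant of the fused covariance.
    """
    Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)
    P_inv = w * Pa_inv + (1 - w) * Pb_inv      # convex information blend
    P = np.linalg.inv(P_inv)
    x = P @ (w * Pa_inv @ xa + (1 - w) * Pb_inv @ xb)
    return x, P

# Two hypothetical camera-filter estimates of a 2-D position
xa = np.array([1.0, 0.0]); Pa = np.diag([0.04, 0.09])  # eye-in-hand
xb = np.array([1.2, 0.1]); Pb = np.diag([0.09, 0.04])  # eye-to-hand
xf, Pf = covariance_intersection(xa, Pa, xb, Pb, w=0.5)
```

The fused mean lies between the two inputs, weighted toward whichever view is more certain in each direction. On a Lie group, the same blend would be applied to tangent-space perturbations rather than raw coordinates.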
Problem

Research questions and friction points this paper is trying to address.

Decentralized fusion for robotic pose and velocity estimation
Tracking moving targets using dual-view vision sensor configuration
Correlation-aware fusion on Lie groups for improved estimation accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decentralized fusion approach for pose and velocity estimation
Dual-view adaptive extended Kalman filters on Lie groups
Correlation-aware fusion rule on Lie groups for state fusion
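A filter "on a matrix Lie group", as in the bullets above, applies its correction in the tangent space and maps it back with the exponential map instead of adding vectors in Euclidean space. As a minimal self-contained illustration (limited to the rotation part, SO(3); not the paper's implementation), the Rodrigues formula gives the exponential map:

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues formula: rotation vector (tangent space) -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3)                  # small-angle limit
    a = phi / theta                       # unit rotation axis
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])    # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# A manifold filter retracts its tangent-space correction onto the group:
R = np.eye(3)
delta = np.array([0.0, 0.0, np.pi / 2])   # correction: 90 deg about z
R_new = R @ so3_exp(delta)                # stays a valid rotation matrix
```

The key property is that `R_new` remains an exact rotation matrix after every update, which is what the bullets' on-manifold formulation buys over a Euclidean parameterization.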