AI Summary
This work addresses the lack of high-fidelity, reproducible ground-truth motion data for evaluating augmented reality (AR) systems, a gap stemming from the biomechanical variability of human movement. To this end, we present ARBot, the first high-fidelity robotic teleoperation platform that supports natural wrist motion and mobile 6-DOF control. ARBot integrates computer vision and inertial measurement unit (IMU) sensing to capture human actions in real time, and it employs a proactively safety-aware quadratic programming (QP) controller to enable smooth, jitter-free robotic replay. We open-source the ARBot platform along with a benchmark dataset of 132 trajectories, significantly improving the controllability, reproducibility, and scalability of AR interaction evaluation.
Abstract
Validating Augmented Reality (AR) tracking and interaction models requires precise, repeatable ground-truth motion. However, human users cannot reliably reproduce consistent motion due to biomechanical variability. Robotic manipulators are a promising alternative: they can serve as human motion proxies if they can faithfully mimic human movements. In this work, we design and implement ARBot, a real-time teleoperation platform that effectively captures natural human motion and accurately replays the movements via robotic manipulators. ARBot includes two capture modes: stable wrist motion capture via a custom computer vision (CV) and IMU pipeline, and natural 6-DOF control via a mobile application. We design a proactively-safe QP controller to ensure smooth, jitter-free execution on the robotic manipulator, enabling it to function as a high-fidelity record-and-replay physical proxy. We open-source ARBot and release a benchmark dataset of 132 human and synthetic trajectories captured using ARBot to support controllable and scalable AR evaluation.
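The abstract does not specify the form of the QP controller. As one minimal illustration of the general idea, a velocity-level tracking QP with per-joint velocity limits reduces to box-constrained least squares; the Jacobian `J`, target velocity `v_des`, and limit `QDOT_MAX` below are hypothetical placeholders for a single arm configuration, not ARBot's actual formulation:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical end-effector Jacobian for a 3-joint arm at one configuration
# (end-effector velocity = J @ qdot); values are illustrative only.
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.5]])
v_des = np.array([0.3, -0.2])  # desired end-effector velocity (illustrative)
QDOT_MAX = 1.0                 # per-joint velocity limit (illustrative)

# QP: minimize ||J @ qdot - v_des||^2 subject to |qdot_i| <= QDOT_MAX.
# With only box constraints this is exactly bounded least squares.
res = lsq_linear(J, v_des, bounds=(-QDOT_MAX, QDOT_MAX))
qdot = res.x  # joint velocity command respecting the safety limits
```

A full safety-aware controller would typically add further constraints (acceleration limits, distance-to-obstacle terms) and hand the resulting QP to a general solver; this sketch only shows the tracking-plus-limits core.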