🤖 AI Summary
The kinematic and visual inconsistency between human and robotic arms poses significant challenges for intuitive teleoperation mapping. To address this, this paper proposes an augmented reality (AR)-enhanced motion mapping method: a motion capture system drives a virtual human arm model in real time, which is then rigidly aligned and superimposed onto the physical robot arm via a head-mounted AR display (e.g., HoloLens). Crucially, we introduce "co-directional AR human arm visualization" as an intuitive, real-time motion reference to establish a natural pose correspondence between operator and robot. Experimental evaluation demonstrates substantial improvements in teleoperation performance: task completion time decreases by 23%, end-effector positioning error is reduced by 31%, and control skill acquisition time shortens by approximately 48%. This work establishes a generalizable, AR-assisted mapping paradigm for teleoperating non-anthropomorphic robotic manipulators.
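The summary mentions rigidly aligning the virtual arm model onto the physical robot. One standard way to compute such a rigid alignment from corresponding 3D points (e.g., tracked markers on the robot and their counterparts in the virtual model) is a least-squares fit via the Kabsch/Umeyama SVD method. The sketch below is an illustrative assumption, not the paper's stated implementation; the function name `rigid_align` and the use of point correspondences are hypothetical.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (Kabsch): find R, t so that R @ src + t ~ dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the optimal orthogonal matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Usage: recover a known rotation + translation from noiseless correspondences
rng = np.random.default_rng(0)
src = rng.standard_normal((10, 3))
th = 0.5  # rotation of 0.5 rad about the z-axis (hypothetical ground truth)
R0 = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0,         0.0,        1.0]])
t0 = np.array([0.1, -0.2, 0.3])
dst = src @ R0.T + t0
R, t = rigid_align(src, dst)
```

Once `R` and `t` are estimated, the same transform can be applied to the whole virtual arm model each frame so the AR overlay stays registered to the robot.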
📄 Abstract
Teleoperating a robot arm typically involves the human operator positioning the robot's end-effector or programming each joint. Whereas humans can control their own arms easily by integrating visual and proprioceptive feedback, it is challenging to control an external robot arm in the same way, because its orientation and appearance are inconsistent with the operator's own arm. We explore teleoperating a robot arm through motion capture (MoCap) of the human operator's arm with the assistance of augmented reality (AR) visualisations. We investigate how AR helps teleoperation by visualising a virtual reference of the human arm alongside the robot arm to help users understand the movement mapping. We found that an AR overlay of a humanoid arm on the robot, oriented the same way as the operator's arm, helped users learn the control. We discuss our findings and future work on MoCap-based robot teleoperation.