🤖 AI Summary
This work addresses the challenges of high-dimensional control and high equipment costs in mobile dual-arm teleoperation by proposing a low-cost, open-source whole-body teleoperation system. The system leverages the inertial measurement unit (IMU) of a standard smartphone for head tracking to control the camera viewpoint, delivering immersive visual feedback. Dual-arm manipulation is achieved through a leader-follower robotic arm setup, while hands-free base navigation is enabled via foot pedals. Built on the XLeRobot framework, the system features a modular architecture. User studies show that, compared to conventional keyboard-based control, the proposed approach significantly improves task efficiency, reduces cognitive load, and substantially lowers the barrier to entry for teleoperation.
📝 Abstract
Teleoperation of mobile bimanual manipulators requires simultaneous control of high-dimensional systems, often necessitating expensive specialized equipment. We present an open-source teleoperation framework that enables intuitive whole-body control using readily available commodity hardware. Our system combines smartphone-based head tracking for camera control, leader arms for bilateral manipulation, and foot pedals for hands-free base navigation. Using a standard smartphone with an IMU and display, we eliminate the need for costly VR headsets while maintaining immersive visual feedback. The modular architecture integrates seamlessly with the XLeRobot framework and can be easily adapted to other types of mobile manipulators. We validate our approach through user studies that demonstrate improved task performance and reduced cognitive load compared to keyboard-based control.
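The head-tracking idea described above, mapping smartphone IMU orientation to a camera viewpoint, can be sketched in a few lines. The function below is an illustrative assumption, not the paper's actual code: it takes yaw/pitch angles (in degrees) from a phone's orientation sensor, applies a small dead zone so sensor tremor does not jitter the camera, and clamps the result to the camera gimbal's mechanical range. The parameter names and ranges are hypothetical.

```python
def imu_to_camera(yaw_deg, pitch_deg,
                  pan_range=(-90.0, 90.0),
                  tilt_range=(-45.0, 45.0),
                  dead_zone_deg=2.0):
    """Map phone IMU yaw/pitch (degrees) to camera pan/tilt targets.

    Illustrative sketch only; ranges and names are assumptions,
    not the system's published API.
    """
    def shape(angle, lo, hi):
        # Suppress small tremors around neutral, then clamp to limits.
        if abs(angle) < dead_zone_deg:
            angle = 0.0
        return max(lo, min(hi, angle))

    pan = shape(yaw_deg, *pan_range)
    tilt = shape(pitch_deg, *tilt_range)
    return pan, tilt


# Example: small head wobble is ignored; large turns saturate at limits.
print(imu_to_camera(1.0, -1.5))    # dead zone -> (0.0, 0.0)
print(imu_to_camera(120.0, -60.0)) # clamped  -> (90.0, -45.0)
```

A real implementation would also filter the raw IMU stream (e.g., a complementary filter) and stream the targets to the robot at a fixed rate, but the core mapping is this simple, which is part of why a commodity smartphone can replace a VR headset here.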