🤖 AI Summary
Imitation learning for robotics faces a critical bottleneck in acquiring high-fidelity demonstration data across heterogeneous platforms, largely due to the poor cross-device compatibility and limited precision of existing teleoperation systems. To address this, we present Open TeleDex, a hardware-agnostic unified teleoperation framework built on a three-layer decoupled architecture: (1) a hardware abstraction layer for device interoperability; (2) real-time pose estimation and inverse kinematics solving; and (3) an optimization-based hand pose retargeting algorithm. Open TeleDex tackles the TripleAny challenge, enabling seamless, high-accuracy mapping among arbitrary robotic arms, dexterous hands, and input devices. Evaluation across multiple robot platforms demonstrates improved cross-platform data fidelity and generalization, supporting robust generation of high-quality demonstrations for complex dexterous manipulation. The complete system is open-sourced, providing scalable foundational infrastructure for both imitation learning research and industrial deployment.
📝 Abstract
Accurate, high-fidelity demonstration data acquisition is a critical bottleneck for deploying robot Imitation Learning (IL) systems, particularly across heterogeneous robotic platforms. Existing teleoperation systems often fail to guarantee high-precision data collection across diverse teleoperation devices. To address this, we developed Open TeleDex, a unified teleoperation framework engineered for demonstration data collection. Open TeleDex specifically tackles the TripleAny challenge: seamlessly supporting any robotic arm, any dexterous hand, and any external input device. Furthermore, we propose a novel hand pose retargeting algorithm that significantly boosts the interoperability of Open TeleDex, enabling robust and accurate compatibility with an even wider spectrum of heterogeneous master and slave equipment. Open TeleDex establishes a foundational, high-quality, and publicly available platform for accelerating both academic research and industrial development in complex robotic manipulation and IL.