Augmented Reality-Enhanced Robot Teleoperation for Collecting User Demonstrations

📅 2025-09-15
🤖 AI Summary
Traditional industrial robot programming relies heavily on expert knowledge, entails long development cycles, and imposes high entry barriers. While existing programming-by-demonstration (PbD) approaches reduce complexity, their teaching interfaces remain unintuitive and lack sufficient safety guarantees. To address these limitations, this paper proposes an augmented reality (AR)-enhanced teleoperation framework that leverages real-time point cloud rendering for contactless, spatially aligned, and intuitive demonstration capture. The framework integrates environment perception with immersive 3D spatial visualization, supports natural gesture-based interaction, and is compatible with the ABB IRB 1200 and GoFa 5 robotic platforms. Experimental evaluation demonstrates a 28% improvement in task accuracy and execution efficiency, alongside a 12% increase in system usability as measured by the System Usability Scale (SUS). The key contribution lies in the deep coupling of AR with point-cloud-driven teleoperation, significantly enhancing teaching efficiency, precision, and human-robot collaborative safety.

📝 Abstract
Traditional industrial robot programming is often complex and time-consuming, typically requiring weeks or even months of effort from expert programmers. Although Programming by Demonstration (PbD) offers a more accessible alternative, intuitive interfaces for robot control and demonstration collection remain challenging. To address this, we propose an Augmented Reality (AR)-enhanced robot teleoperation system that integrates AR-based control with spatial point cloud rendering, enabling intuitive, contact-free demonstrations. This approach allows operators to control robots remotely without entering the workspace or using conventional tools like the teach pendant. The proposed system is generally applicable and has been demonstrated on ABB robot platforms, specifically validated with the IRB 1200 industrial robot and the GoFa 5 collaborative robot. A user study evaluates the impact of real-time environmental perception, specifically with and without point cloud rendering, on task completion accuracy, efficiency, and user confidence. Results indicate that enhanced perception significantly improves task performance by 28% and enhances user experience, as reflected by a 12% increase in the System Usability Scale (SUS) score. This work contributes to the advancement of intuitive robot teleoperation, AR interface design, environmental perception, and teleoperation safety mechanisms in industrial settings for demonstration collection. The collected demonstrations may serve as valuable training data for machine learning applications.
Problem

Research questions and friction points this paper is trying to address.

Intuitive robot teleoperation for demonstration collection
AR-enhanced control without physical workspace entry
Real-time environmental perception impact evaluation
Innovation

Methods, ideas, or system contributions that make the work stand out.

AR-enhanced teleoperation with point cloud rendering
Contact-free robot control without teach pendant
Real-time environmental perception improves performance
👥 Authors
Shiqi Gong (Beijing Institute of Technology)
Sebastian Zudaire (ABB Corporate Research)
Chi Zhang (ABB Corporate Research)
Zhen Li (ABB Corporate Research)