A Virtual Mechanical Interaction Layer Enables Resilient Human-to-Robot Object Handovers

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address insufficient robustness in human-to-robot object handover caused by dynamic object pose changes, this paper proposes an interaction-layer framework integrating Virtual Model Control (VMC) and Augmented Reality (AR). Methodologically, it establishes an impedance-regulated virtual mechanical interaction layer that compensates for pose disturbances in real time, while leveraging AR for bidirectional motion guidance and state visualization, enhancing the naturalness and interpretability of human-robot collaboration. The key contribution is the coupling of VMC and AR into a closed-loop "perception-modeling-feedback" architecture, enabling compliant adaptation and real-time interaction in hand-to-hand scenarios. Experiments demonstrate resilience to pose discontinuities and grasp uncertainty, and a user study (N=16) shows a preference for the proposed approach over baseline control profiles, identifying the coordinated tuning of visual cue density and impedance parameters as a pathway for improving user experience.
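The impedance-regulated virtual interaction layer described above can be sketched as a spring-damper law that pulls the robot's end-effector toward the (possibly moving) object pose. The sketch below is a minimal illustration under assumed gains `K` and `D` and a point-mass end-effector; it is not the paper's actual controller or parameter values.

```python
import numpy as np

def vmc_force(x, v, x_goal, K=50.0, D=14.0):
    """Virtual spring-damper force pulling the end-effector toward the goal pose.

    K and D are illustrative impedance gains (near-critical damping for m = 1 kg),
    not values from the paper.
    """
    return K * (x_goal - x) - D * v

# Simulate a 1-kg point-mass "end-effector" whose goal jumps mid-handover,
# mimicking a sudden change in the object's pose.
dt, m = 0.002, 1.0
x = np.zeros(2)                 # end-effector position (m)
v = np.zeros(2)                 # end-effector velocity (m/s)
goal = np.array([0.4, 0.2])     # initial object pose
for step in range(4000):
    if step == 2000:            # object pose disturbance at t = 4 s
        goal = np.array([0.3, 0.35])
    f = vmc_force(x, v, goal)
    v += (f / m) * dt           # explicit Euler integration
    x += v * dt

print(np.allclose(x, goal, atol=1e-3))  # True: converged to the new pose
```

Because the virtual model acts on the error between the end-effector and the current object pose, the same control law absorbs pose discontinuities without replanning, which is the compliance property the framework relies on.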

📝 Abstract
Object handover is a common form of interaction in collaborative tasks. However, achieving it efficiently remains a challenge. We address the problem of ensuring resilient robotic actions that can adapt to complex changes in object pose during human-to-robot object handovers. We propose the use of Virtual Model Control to create an interaction layer that controls the robot and adapts to the dynamic changes in the handover process. Additionally, we propose the use of augmented reality to facilitate bidirectional communication between humans and robots during handovers. We assess the performance of our controller in a set of experiments that demonstrate its resilience to various sources of uncertainty, including complex changes to the object's pose during the handover. Finally, we perform a user study with 16 participants to understand human preferences for different robot control profiles and augmented reality visuals in object handovers. Our results show a general preference for the proposed approach and reveal insights that can guide further development in adapting the interaction with the user.
Problem

Research questions and friction points this paper is trying to address.

Ensuring resilient robotic actions during human-to-robot object handovers
Adapting to complex changes in object pose dynamically
Facilitating bidirectional communication between humans and robots
Innovation

Methods, ideas, or system contributions that make the work stand out.

Virtual Model Control creates adaptive robot interaction layer
Augmented reality enables bidirectional human-robot communication
Controller demonstrates resilience to object pose uncertainties
Omar Faris
Department of Engineering, University of Cambridge, UK
Sławomir Tadeja
Department of Mechanical Engineering, Massachusetts Institute of Technology
Fulvio Forni
University of Cambridge