Towards Intuitive Human-Robot Interaction through Embodied Gesture-Driven Control with Woven Tactile Skins

πŸ“… 2025-09-30
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To bridge the semantic gap between human intent and robotic response and make human-robot interaction more intuitive, this paper proposes a novel embodied interaction framework based on a capacitive woven tactile skin. The skin interweaves conductive yarns to achieve high-density, structurally robust, and conformal sensing on curved surfaces, marking the first integration of such a sensor directly onto a robot's physical body. Coupled with a lightweight convolutional-Transformer hybrid model, it recognizes 14 single- and multi-touch gestures in real time with near-100% accuracy. A mapping strategy from gestures to task-space motion and auxiliary functions is further established. In pick-and-place and pouring tasks performed by a robotic manipulator, the framework reduces task completion time by up to 57% compared with conventional keyboard and teach-pendant interfaces, significantly improving interaction naturalness, response efficiency, and operational intuitiveness.
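The gesture-to-task-space mapping strategy can be illustrated with a minimal sketch. The gesture names, axes, and velocity values below are illustrative assumptions, not the paper's actual mapping table; the idea is only that each recognized gesture label resolves to either a task-space velocity command or an auxiliary function.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskSpaceCommand:
    # End-effector linear velocity (m/s, illustrative magnitudes) plus an
    # optional auxiliary function such as gripper toggling.
    vx: float = 0.0
    vy: float = 0.0
    vz: float = 0.0
    aux: Optional[str] = None

# Hypothetical gesture labels and commands (a subset of the 14 gestures).
GESTURE_MAP = {
    "swipe_forward":   TaskSpaceCommand(vx=+0.05),
    "swipe_backward":  TaskSpaceCommand(vx=-0.05),
    "swipe_left":      TaskSpaceCommand(vy=+0.05),
    "swipe_right":     TaskSpaceCommand(vy=-0.05),
    "swipe_up":        TaskSpaceCommand(vz=+0.05),
    "swipe_down":      TaskSpaceCommand(vz=-0.05),
    "double_tap":      TaskSpaceCommand(aux="toggle_gripper"),
    "two_finger_hold": TaskSpaceCommand(aux="emergency_stop"),
}

def command_for(gesture: str) -> TaskSpaceCommand:
    # Unrecognized gestures fall back to a zero (hold-position) command.
    return GESTURE_MAP.get(gesture, TaskSpaceCommand())
```

A controller loop would call `command_for` on each classifier output and forward the resulting velocity to the manipulator's task-space controller.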

πŸ“ Abstract
This paper presents a novel human-robot interaction (HRI) framework that enables intuitive gesture-driven control through a capacitance-based woven tactile skin. Unlike conventional interfaces that rely on panels or handheld devices, the woven tactile skin integrates seamlessly with curved robot surfaces, enabling embodied interaction and narrowing the gap between human intent and robot response. Its woven design combines fabric-like flexibility with structural stability and dense multi-channel sensing through interlaced conductive threads. Building on this capability, we define a gesture-action mapping of 14 single- and multi-touch gestures that cover representative robot commands, including task-space motion and auxiliary functions. A lightweight convolution-transformer model designed for real-time gesture recognition achieves near-100% accuracy, outperforming prior baselines. Experiments on robot arm tasks, including pick-and-place and pouring, demonstrate that our system reduces task completion time by up to 57% compared with keyboard panels and teach pendants. Overall, the proposed framework demonstrates a practical pathway toward more natural and efficient embodied HRI.
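The dense multi-touch sensing described in the abstract can be sketched as a blob-extraction step over a single capacitance frame from the woven grid: cells above a threshold are grouped into 4-connected blobs, and each blob's centroid approximates one finger contact. The threshold and the example frame values are illustrative assumptions, not measurements from the paper's sensor.

```python
from collections import deque

def touch_centroids(frame, threshold=0.5):
    """Extract touch centroids from a 2D capacitance frame (rows x cols).

    Cells above `threshold` (an assumed value) are flood-filled into
    4-connected blobs; each blob yields one (row, col) centroid.
    """
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and not seen[r][c]:
                # Flood-fill one blob, collecting its cell coordinates.
                queue, cells = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cy = sum(y for y, _ in cells) / len(cells)
                cx = sum(x for _, x in cells) / len(cells)
                centroids.append((cy, cx))
    return centroids

# Two separated contacts on a small 4x6 grid (illustrative values).
frame = [
    [0.0, 0.9, 0.0, 0.0, 0.0, 0.0],
    [0.8, 0.9, 0.0, 0.0, 0.7, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.8, 0.8],
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
]
print(len(touch_centroids(frame)))  # two distinct contacts
```

A sequence of such centroid sets over time would form the trajectory input that a gesture classifier consumes.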
Problem

Research questions and friction points this paper is trying to address.

Developing intuitive gesture-driven control for human-robot interaction
Integrating flexible woven tactile skins on curved robot surfaces
Reducing task completion time with real-time gesture recognition

Innovation

Methods, ideas, or system contributions that make the work stand out.

Woven tactile skin enables gesture-driven robot control
Lightweight convolution-transformer model achieves near-perfect recognition
Gesture-action mapping covers 14 touch commands for robots
πŸ”Ž Similar Papers
No similar papers found.
C
ChunPing Lam
Centre for Perceptual and Interactive Intelligence (CPII), Hong Kong SAR, China; Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China
Xiangjia Chen
Centre for Perceptual and Interactive Intelligence (CPII), Hong Kong SAR, China; Texense Ltd., Hong Kong SAR, China
Chenming Wu
Researcher, Baidu Inc.
Robotics, Graphics, 3D Vision, Computational Design
Hao Chen
Centre for Perceptual and Interactive Intelligence (CPII), Hong Kong SAR, China
Binzhi Sun
Centre for Perceptual and Interactive Intelligence (CPII), Hong Kong SAR, China; Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China
Guoxin Fang
Assistant Professor, The Chinese University of Hong Kong
Digital Manufacturing, Computational Design, Robotics, Geometric Computing
Charlie C. L. Wang
Department of Mechanical, Aerospace and Civil Engineering, University of Manchester, United Kingdom
Chengkai Dai
Centre for Perceptual and Interactive Intelligence (CPII), Hong Kong SAR, China; Texense Ltd., Hong Kong SAR, China
Yeung Yam
Centre for Perceptual and Interactive Intelligence (CPII), Hong Kong SAR, China; Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Hong Kong SAR, China; Texense Ltd., Hong Kong SAR, China