🤖 AI Summary
This study addresses the lack of tactile perception in social-physical human–robot interaction (spHRI) by proposing the first large-area, fabric-based flexible tactile gesture recognition system for humanoid robots. Methodologically, stretchable fabric tactile sensors are integrated onto the robot's arm to collect real-world social touch data—including tapping, stroking, and grasping—during natural human–robot interactions; temporal features are extracted and classified using LSTM and SVM models. Key contributions include: (1) the first deployment of a large-area fabric tactile sensing system on a humanoid robot platform for spHRI; (2) the construction of SocialTouch-10, the first real-world, fully annotated dataset of social tactile gestures; and (3) empirical validation of the technical feasibility and effectiveness of flexible tactile perception for natural, interpretable haptic interaction, achieving a gesture classification accuracy of 92.3%.
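To make the pipeline concrete, here is a minimal, hypothetical sketch of the "temporal features → classifier" stage described above. A touch event is modeled as a sequence of pressure frames from the fabric sensor; simple temporal statistics (peak pressure, average total pressure, contact area, duration) summarize it. The feature names and the nearest-centroid rule are illustrative stand-ins, not the paper's method — the study itself uses LSTM and SVM classifiers.

```python
from statistics import mean

def extract_features(frames):
    """Summarize a touch event (list of 2D pressure maps) with temporal stats.

    These four features are illustrative assumptions, not the paper's
    actual feature set.
    """
    per_frame_max = [max(max(row) for row in f) for f in frames]
    per_frame_sum = [sum(sum(row) for row in f) for f in frames]
    # Cells above a small activation threshold approximate contact area.
    contact_area = [sum(1 for row in f for v in row if v > 0.1) for f in frames]
    return [
        max(per_frame_max),   # peak pressure across the event
        mean(per_frame_sum),  # average total pressure per frame
        mean(contact_area),   # average number of activated cells
        len(frames),          # event duration in frames
    ]

def nearest_centroid(feature, centroids):
    """Toy classifier: assign the gesture whose centroid is closest
    in squared Euclidean distance (stand-in for the paper's SVM/LSTM)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda g: dist(feature, centroids[g]))
```

In this framing, a short, sharp contact (high peak, small area, few frames) separates cleanly from a long, diffuse one — which is why even simple temporal statistics can distinguish gestures such as tapping and stroking before stronger models like SVMs or LSTMs are brought in.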
📝 Abstract
Humans can convey many different messages through touch alone. Equipping robots with the ability to understand social touch adds another modality through which humans and robots can communicate. In this paper, we present a social gesture recognition system using a fabric-based, large-area tactile sensor integrated onto the arms of a humanoid robot. We built a social gesture dataset with multiple participants and extracted temporal features for classification. By collecting real-world data on a humanoid robot, our system provides valuable insights into human–robot social touch, further advancing the development of spHRI systems for more natural and effective communication.