🤖 AI Summary
Problem: Existing robotic tactile interfaces struggle to convey complex emotions with precision.
Method: We propose an LLM-driven wearable vibrotactile sleeve built around a 5×5 actuator array, directly leveraging a large language model (ChatGPT) to generate semantically grounded, emotion-discriminative sparse tactile sequences. Through chained prompting, we generate a distinct 10-second vibration pattern for each of ten emotions and six touch gestures, and evaluate emotional fidelity along the valence–arousal dimensions of participants' ratings.
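The paper's exact prompts and output format are not reproduced here, so the following is a minimal sketch of what chained prompting for tactile generation could look like. The `call_llm` wrapper, the two-step prompt wording, and the JSON frame schema are all assumptions for illustration, not the authors' published pipeline.

```python
import json

GRID = 5            # 5×5 actuator array
DURATION_S = 10.0   # each pattern lasts 10 seconds

def call_llm(prompt: str) -> str:
    """Hypothetical wrapper around an LLM chat API (e.g., ChatGPT).
    The real system's model calls are not described at this level."""
    raise NotImplementedError

def generate_pattern(emotion: str) -> list[dict]:
    # Step 1 of the chain: ask for a touch-gesture plan in words.
    plan = call_llm(
        f"Describe, as a sequence of touch gestures (pat, rub, tap, ...), "
        f"how a person would express '{emotion}' on someone's forearm."
    )
    # Step 2 of the chain: ask the model to encode that plan as timed
    # actuation frames for the grid, returned as JSON.
    raw = call_llm(
        "Encode this plan as JSON: a list of frames, each "
        '{"t": start_seconds, "dur": seconds, "cells": [[row, col, intensity], ...]}, '
        f"with 0 <= row, col < {GRID}, 0 <= intensity <= 1, and a total "
        f"length of {DURATION_S} seconds.\nPlan: {plan}"
    )
    frames = json.loads(raw)
    # Basic validation: clamp intensities and drop out-of-range cells
    # rather than trusting the model's output blindly.
    for frame in frames:
        frame["cells"] = [
            [r, c, min(max(i, 0.0), 1.0)]
            for r, c, i in frame["cells"]
            if 0 <= r < GRID and 0 <= c < GRID
        ]
    return frames
```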
Contribution/Results: A user study with 32 participants shows that people accurately recognise the intended emotions from the vibration patterns, a result consistent with earlier findings on sparse tactile encoding and one that supports both the effectiveness and interpretability of LLM-generated tactile signals. The work demonstrates an end-to-end "language-to-tactile" affective encoding pipeline, offering a methodological framework and a scalable technical pathway for emotion-aware embodied intelligence.
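The abstract reports valence–arousal ratings rather than a specific fidelity metric, so the scoring sketch below is purely illustrative: it measures how close the mean participant rating lands to a target point in valence–arousal space. The `TARGETS` coordinates and the `fidelity` helper are hypothetical, not values from the paper.

```python
from statistics import mean

# Hypothetical target coordinates on a [-1, 1] valence/arousal scale;
# the paper's actual reference values are not reproduced here.
TARGETS = {
    "happiness": (0.8, 0.5),
    "sadness": (-0.7, -0.4),
    "fear": (-0.6, 0.7),
}

def fidelity(emotion: str, ratings: list[tuple[float, float]]) -> float:
    """ratings: (valence, arousal) pairs from participants.
    Returns the Euclidean distance between the mean rating and the
    target; smaller means the stimulus landed closer to the intent."""
    v = mean(r[0] for r in ratings)
    a = mean(r[1] for r in ratings)
    tv, ta = TARGETS[emotion]
    return ((v - tv) ** 2 + (a - ta) ** 2) ** 0.5
```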
📝 Abstract
Touch is a fundamental aspect of emotion-rich communication, playing a vital role in human interaction and offering significant potential for human-robot interaction. Previous research has demonstrated that a sparse representation of human touch can effectively convey social tactile signals. However, advances in human-robot tactile interaction remain limited: many humanoid robots have only simple capabilities, such as opening and closing their hands, which restricts nuanced tactile expression. In this study, we explore how a robot can use sparse representations of tactile vibrations to convey emotions to a person. To achieve this, we developed a wearable sleeve integrated with a 5×5 grid of vibration motors, enabling the robot to communicate diverse tactile emotions and gestures. Using chained prompts to a Large Language Model (LLM), we generated distinct 10-second vibration patterns corresponding to 10 emotions (e.g., happiness, sadness, fear) and 6 touch gestures (e.g., pat, rub, tap). Participants (N = 32) then rated each vibration stimulus on perceived valence and arousal. Participants were accurate at recognising the intended emotions, a result that aligns with earlier findings. These results highlight the LLM's ability to generate emotional haptic data and to convey emotions effectively through tactile signals. By translating complex emotional and tactile expressions into vibratory patterns, this research demonstrates how LLMs can enhance physical interaction between humans and robots.
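To make the hardware side concrete, here is a minimal playback sketch for patterns in the frame schema sketched earlier. The `set_motor` call stands in for whatever PWM or motor-driver interface the sleeve actually uses; it is purely hypothetical, since the abstract does not describe the driver layer.

```python
import time

def set_motor(row: int, col: int, intensity: float) -> None:
    """Hypothetical driver call: set one motor's vibration amplitude
    (0.0 to 1.0). The sleeve's real motor interface is not described
    at this level of detail."""
    ...

def play(frames: list[dict]) -> None:
    """Render a 10-second pattern; frames are assumed sorted by onset "t"."""
    start = time.monotonic()
    for frame in frames:
        # Wait until this frame's scheduled onset.
        delay = frame["t"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        # Activate this frame's cells, hold for its duration, then stop.
        for r, c, intensity in frame["cells"]:
            set_motor(r, c, intensity)
        time.sleep(frame["dur"])
        for r, c, _ in frame["cells"]:
            set_motor(r, c, 0.0)
```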