🤖 AI Summary
To address limited sign language accessibility for Deaf and Hard-of-Hearing (DHH) individuals, this work proposes an embodied humanoid robot sign language interaction system that enables end-to-end, bidirectional closed-loop communication, from sign input to precise robotic articulation and natural-language response. Methodologically, it introduces a "cerebellum-inspired motor control" framework paired with "cortex-guided semantic understanding": motion retargeting, learning-based cerebellum-like control policies, and a tightly coupled generative interaction architecture comprising translation, response generation, and sign articulation modules. The system supports cross-platform deployment and generalizes across multiple sign language datasets. Evaluations on both simulation and physical robot platforms demonstrate high-fidelity sign execution and real-time interactivity, significantly enhancing DHH users' communication autonomy and accessibility.
📝 Abstract
Sign language is a natural, visual language that uses movements and expressions to convey meaning, serving as a crucial means of communication for individuals who are deaf or hard-of-hearing (DHH). However, the number of people proficient in sign language remains limited, highlighting the need for technology that bridges communication gaps and fosters interaction with the DHH community. Building on recent advances in embodied humanoid robots, we propose SignBot, a novel framework for human-robot sign language interaction. SignBot integrates a cerebellum-inspired motion control component with a cerebrum-oriented module for comprehension and interaction. Specifically, SignBot consists of: 1) Motion Retargeting, which converts human sign language datasets into robot-compatible kinematics; 2) Motion Control, which leverages a learning-based paradigm to develop a robust humanoid control policy for tracking sign language gestures; and 3) Generative Interaction, which incorporates a sign language translator, responder, and generator, enabling natural and effective communication between robots and humans. Simulation and real-world experimental results demonstrate that SignBot can effectively facilitate human-robot interaction and perform sign language motions across diverse robots and datasets. SignBot represents a significant advance in automatic sign language interaction on embodied humanoid robot platforms, offering a promising path to improved communication accessibility for the DHH community.
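The closed-loop pipeline described in the abstract (perceive sign → translate → respond → generate sign motion → execute on the robot) can be sketched as follows. This is a minimal data-flow illustration only: every class and method name below is a hypothetical stand-in, not the paper's actual API, and each stage is stubbed where the real system would use learned models and a physical controller.

```python
from dataclasses import dataclass

# Hypothetical sketch of one SignBot interaction turn.
# All names are illustrative assumptions, not the authors' implementation.

@dataclass
class SignUtterance:
    gloss: str  # recognized sign language gloss sequence

class Translator:
    """Sign input -> natural-language text (stubbed)."""
    def translate(self, signs: SignUtterance) -> str:
        return f"text({signs.gloss})"

class Responder:
    """Natural-language text -> reply text (a dialogue model in practice)."""
    def respond(self, text: str) -> str:
        return f"reply({text})"

class Generator:
    """Reply text -> sign motion retargeted to robot-compatible kinematics."""
    def generate(self, reply: str) -> list[float]:
        # The real system would map human sign keypoints to robot joints.
        return [0.0] * 7  # placeholder joint targets

class ControlPolicy:
    """Cerebellum-like low-level policy tracking the sign trajectory."""
    def track(self, joint_targets: list[float]) -> bool:
        return len(joint_targets) > 0  # pretend execution succeeded

def interaction_loop(signs: SignUtterance) -> tuple[str, bool]:
    """One closed-loop turn: sign input -> understanding -> reply -> articulation."""
    text = Translator().translate(signs)
    reply = Responder().respond(text)
    motion = Generator().generate(reply)
    executed = ControlPolicy().track(motion)
    return reply, executed
```

The point of the sketch is the module boundary: comprehension (translator, responder, generator) is decoupled from motor control, so either side could be swapped per robot platform or sign language dataset.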