🤖 AI Summary
This study addresses whole-body contact localization using only intrinsic proprioceptive joint sensors, without additional tactile hardware. The authors propose an end-to-end data-driven framework that fuses temporal sequences of joint torques, positions, and velocities with dynamics-informed features, processed by a lightweight deep learning model for real-time tactile perception. The method requires no hardware modification and is plug-and-play, enhancing tactile capability in unstructured environments and human-robot interaction. Evaluated on two commercial platforms, the Franka Emika Panda and the Boston Dynamics Spot, it achieves contact-localization errors of 8.0 cm (Franka) and 7.2 cm (Spot) at inference rates of roughly 2,000 Hz on an RTX 3090 GPU. To the authors' knowledge, this is the first approach to enable high-accuracy, whole-body, proprioception-only tactile localization, advancing tactile sensing toward low-cost, general-purpose, real-time deployment.
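To make the input pipeline concrete, here is a minimal sketch of how a temporal window of proprioceptive signals might be stacked and fed to a small regressor predicting a 3-D contact point. The window length, joint count, layer sizes, and all function names are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

# Assumed hyperparameters (not from the paper).
WINDOW = 10    # number of past timesteps in the temporal window
N_JOINTS = 7   # e.g. the Franka Emika Panda has 7 joints

def build_input(torques, positions, velocities):
    """Stack a temporal window of proprioceptive signals into one feature vector.

    Each argument has shape (WINDOW, N_JOINTS); the result is flat,
    with shape (3 * WINDOW * N_JOINTS,).
    """
    feats = np.concatenate([torques, positions, velocities], axis=1)
    return feats.ravel()

rng = np.random.default_rng(0)

# Stand-in "lightweight model": a tiny randomly initialized MLP mapping the
# feature vector to a contact location (x, y, z). A trained network would
# replace these weights.
D_IN = 3 * WINDOW * N_JOINTS
W1 = rng.standard_normal((D_IN, 64)) * 0.01
b1 = np.zeros(64)
W2 = rng.standard_normal((64, 3)) * 0.01
b2 = np.zeros(3)

def predict_contact(x):
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                # predicted 3-D contact point

# Usage with dummy sensor readings:
x = build_input(rng.standard_normal((WINDOW, N_JOINTS)),
                rng.standard_normal((WINDOW, N_JOINTS)),
                rng.standard_normal((WINDOW, N_JOINTS)))
contact = predict_contact(x)  # array of shape (3,)
```

Because the forward pass is just two small matrix products, kilohertz-rate inference on a GPU (as reported for the real model) is plausible even with a richer temporal encoder.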
📝 Abstract
Robots can better interact with humans and unstructured environments through touch sensing. However, most commercial robots are not equipped with tactile skins, making it challenging to achieve even basic touch-sensing functions, such as contact localization. We present UniTac, a data-driven whole-body touch-sensing approach that uses only proprioceptive joint sensors and does not require the installation of additional sensors. Our approach enables a robot equipped solely with joint sensors to localize contacts. Our goal is to democratize touch sensing and offer HRI researchers an off-the-shelf tool for adding touch-sensing capabilities to their robots. We validate our approach on two platforms: the Franka robot arm and the Spot quadruped. On Franka, we can localize contact to within 8.0 centimeters, and on Spot, we can localize to within 7.2 centimeters at around 2,000 Hz on an RTX 3090 GPU, without adding any additional sensors to the robot. Project website: https://ivl.cs.brown.edu/research/unitac.