🤖 AI Summary
This work proposes a tactile perception framework that integrates multimodal tactile interactions with information-driven exploration to efficiently reconstruct object shapes. Addressing the inefficiency of conventional grasp-based interactions, the study systematically compares three contact modalities—grasp-and-release, fingertip sliding, and palm rolling—and reports for the first time that sliding and rolling significantly enhance information gain. The approach combines an information-theoretic active exploration strategy with a learned shape completion model to dynamically guide subsequent tactile sampling. Experimental validation on a UR5e robotic arm equipped with an Inspire-Robots dexterous hand demonstrates that the proposed method reduces the number of physical interactions by 34% and improves reconstruction accuracy by 55% compared to baseline approaches, while exhibiting robust performance across diverse geometric objects.
📝 Abstract
Tactile sensing allows robots to gather detailed geometric information about objects through physical interaction, complementing vision-based approaches. However, efficiently acquiring useful tactile data remains challenging due to the time-consuming nature of physical contact and the need to strategically choose contact locations that maximize information gain while minimizing physical interactions. This paper studies how different contact modes affect object shape reconstruction using a tactile-enabled dexterous gripper. We compare three contact interaction modes: grasp-releasing, sliding induced by finger-grazing, and palm-rolling. These contact modes are combined with an information-theoretic exploration framework that guides subsequent sampling locations using a shape completion model. Our results show that the improved tactile sensing efficiency of finger-grazing and palm-rolling translates into faster convergence in shape reconstruction, requiring 34% fewer physical interactions while improving reconstruction accuracy by 55%. We validate our approach using a UR5e robot arm equipped with an Inspire-Robots Dexterous Hand, showing robust performance across primitive object geometries.