🤖 AI Summary
Real-time perception of multidimensional tactile states remains challenging in contact-rich manipulation tasks. Method: This paper introduces an active acoustic tactile sensing paradigm built around a two-fingered piezoelectric gripper: one finger excites acoustic waves that propagate through the grasped object's interior and the other receives them, letting the system simultaneously estimate material properties, grasp pose, internal structural orientation, and contact type. Coupling this active acoustic wave propagation sensing with deep learning yields an end-to-end, closed-loop tactile control framework that operates without visual or force sensing. Results: Deployed on a UR5 platform, the system achieves a >92% success rate in peg-in-hole insertion using acoustic feedback alone, while concurrently performing object classification, grasp localization, internal pose estimation, and contact-type recognition. The approach moves past the local, surface-level measurements of conventional tactile sensing, enabling robust, minimally invasive dexterous manipulation driven by several coupled channels of tactile information.
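As a rough illustration of what this excite-and-receive loop can look like in code, the sketch below drives a chirp through one piezo finger, turns the received waveform into a log-spectrogram, and feeds it to a small multi-head network that predicts object class, grasp position, and contact type. The sample rate, chirp band, network architecture, and the stand-in "received" signal are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from scipy import signal
import torch
import torch.nn as nn

FS = 48_000          # sample rate (Hz), assumed
SWEEP_S = 0.1        # chirp duration (s), assumed

def excitation_chirp(f0=100.0, f1=20_000.0):
    """Linear sweep driven through the transmitting piezo finger (band assumed)."""
    t = np.linspace(0.0, SWEEP_S, int(FS * SWEEP_S), endpoint=False)
    return signal.chirp(t, f0=f0, t1=SWEEP_S, f1=f1)

def log_spectrogram(received, n_fft=512):
    """Feature for the learned models: log-magnitude spectrogram of the
    signal picked up by the receiving piezo finger."""
    _, _, spec = signal.spectrogram(received, fs=FS, nperseg=n_fft)
    return np.log(spec + 1e-8).astype(np.float32)

class AcousticHeads(nn.Module):
    """Small CNN with one head per estimated quantity (shapes are assumptions)."""
    def __init__(self, n_objects=10, n_contacts=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.object_head = nn.Linear(32, n_objects)      # object class
        self.grasp_head = nn.Linear(32, 1)               # grasp position (scalar)
        self.contact_head = nn.Linear(32, n_contacts)    # extrinsic contact type

    def forward(self, spec):
        z = self.backbone(spec)
        return self.object_head(z), self.grasp_head(z), self.contact_head(z)

if __name__ == "__main__":
    tx = excitation_chirp()
    # Stand-in for hardware: the real system plays `tx` through one finger and
    # records the wave after it propagates through the grasped object.
    rx = signal.lfilter([0.05], [1.0, -0.6], tx) + 0.01 * np.random.randn(tx.size)
    feat = torch.from_numpy(log_spectrogram(rx))[None, None]   # (B, C, F, T)
    obj_logits, grasp_pos, contact_logits = AcousticHeads()(feat)
    print(obj_logits.shape, grasp_pos.shape, contact_logits.shape)
```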
📝 Abstract
The acoustic response of an object can reveal a lot about its global state, for example, its material properties or the extrinsic contacts it is making with the world. In this work, we build an active acoustic sensing gripper equipped with two piezoelectric fingers: one for generating signals and the other for receiving them. By sending an acoustic vibration from one finger to the other through an object, we gain insight into the object's acoustic properties and contact state. We use this system to classify objects, estimate grasping position, estimate poses of internal structures, and classify the types of extrinsic contacts the object is making with the environment. Using our contact type classification model, we tackle a standard long-horizon manipulation problem: peg insertion. We use a simple simulated transition model based on the performance of our sensor to train an imitation learning policy that is robust to imperfect predictions from the classifier. Finally, we demonstrate the policy on a UR5 robot with active acoustic sensing as the only feedback.
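The training recipe, a simple simulated transition model whose contact observations are corrupted with the sensor's error statistics so that the cloned policy learns to tolerate misclassifications, might be sketched as follows. The contact vocabulary, confusion matrix, transition rules, and scripted expert are illustrative assumptions rather than the paper's actual setup.

```python
import numpy as np

# Assumed contact vocabulary and confusion matrix (row = true, col = predicted);
# in practice the confusion matrix would be measured from the acoustic classifier.
CONTACTS = ["free", "surface", "hole"]
CONFUSION = np.array([
    [0.90, 0.08, 0.02],
    [0.05, 0.88, 0.07],
    [0.02, 0.08, 0.90],
])
ACTIONS = {"free": "lower", "surface": "slide", "hole": "insert"}

def simulate_step(peg_xy, hole_xy, rng):
    """Simple transition model: compute the true contact type for the current
    peg pose, then corrupt it with the sensor's assumed error statistics."""
    touching = peg_xy[1] <= 0.0
    over_hole = touching and abs(peg_xy[0] - hole_xy[0]) < 0.5
    true_c = 2 if over_hole else (1 if touching else 0)
    return rng.choice(3, p=CONFUSION[true_c]), true_c

def expert(true_c):
    """Scripted demonstrator keyed on the *true* contact type."""
    return ACTIONS[CONTACTS[true_c]]

def collect_demo(rng, horizon=40):
    """Roll out the expert while logging what the noisy sensor would report."""
    peg, hole, demo = np.array([rng.uniform(-4, 4), 2.0]), np.array([0.0, 0.0]), []
    for _ in range(horizon):
        obs_c, true_c = simulate_step(peg, hole, rng)
        act = expert(true_c)
        demo.append((obs_c, act))           # (noisy observation, expert action)
        if act == "lower":
            peg[1] -= 0.5
        elif act == "slide":
            peg[0] -= 0.4 * np.sign(peg[0])
        else:
            break                           # peg inserted
    return demo

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demos = [collect_demo(rng) for _ in range(100)]
    # Behavior cloning on `demos` exposes the policy to the same mistaken
    # contact labels at training time that it will see on the real robot.
    print(sum(len(d) for d in demos), "noisy (obs, action) pairs")
```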