VibeCheck: Using Active Acoustic Tactile Sensing for Contact-Rich Manipulation

📅 2025-04-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Real-time perception of multidimensional tactile states remains challenging in contact-rich manipulation tasks. Method: This paper introduces an active acoustic tactile sensing paradigm built around a gripper with two piezoelectric fingers, which excites acoustic waves, receives them after propagation through an object's interior, and uses the response to simultaneously estimate material properties, grasp pose, internal structural orientation, and contact type. It combines acoustic wave propagation sensing with deep learning to establish an end-to-end, tactile-only closed-loop control framework that operates without visual or force sensing. Results: Deployed on a UR5 platform, the system achieves a >92% success rate in peg-in-hole insertion using acoustic feedback alone, while concurrently performing object classification, grasp localization, internal pose estimation, and contact-type recognition. The approach overcomes the dimensional limitations of conventional tactile sensing, enabling robust, minimally invasive, multimodal dexterous manipulation.

📝 Abstract
The acoustic response of an object can reveal a lot about its global state, for example its material properties or the extrinsic contacts it is making with the world. In this work, we build an active acoustic sensing gripper equipped with two piezoelectric fingers: one for generating signals, the other for receiving them. By sending an acoustic vibration from one finger to the other through an object, we gain insight into an object's acoustic properties and contact state. We use this system to classify objects, estimate grasping position, estimate poses of internal structures, and classify the types of extrinsic contacts an object is making with the environment. Using our contact type classification model, we tackle a standard long-horizon manipulation problem: peg insertion. We use a simple simulated transition model based on the performance of our sensor to train an imitation learning policy that is robust to imperfect predictions from the classifier. We finally demonstrate the policy on a UR5 robot with active acoustic sensing as the only feedback.
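The abstract's key training idea is to model the contact-type classifier's imperfect predictions inside a simple simulated transition model, so the learned policy stays robust to misclassifications. The paper does not publish this code; the sketch below is a hypothetical illustration of that idea, and all names, noise rates, and the four contact types are assumptions, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed discrete contact states for a peg-insertion task (illustrative).
CONTACT_TYPES = ["no_contact", "edge_contact", "aligned", "inserted"]

# Assumed confusion matrix modeling an imperfect acoustic classifier:
# row = true contact type, column = probability of each predicted label.
CONFUSION = np.array([
    [0.90, 0.06, 0.03, 0.01],
    [0.08, 0.85, 0.05, 0.02],
    [0.03, 0.07, 0.85, 0.05],
    [0.01, 0.02, 0.07, 0.90],
])

def noisy_prediction(true_idx: int) -> int:
    """Sample a classifier prediction given the true contact type."""
    return int(rng.choice(len(CONTACT_TYPES), p=CONFUSION[true_idx]))

def rollout(policy, transition, start_idx=0, horizon=20):
    """Roll out a policy that only ever sees noisy contact-type predictions.

    Training against such rollouts exposes the policy to the same
    misclassifications it will face on the real robot.
    """
    state = start_idx
    trajectory = []
    for _ in range(horizon):
        obs = noisy_prediction(state)   # acoustic feedback is the only input
        action = policy(obs)
        trajectory.append((obs, action))
        state = transition(state, action)
        if CONTACT_TYPES[state] == "inserted":
            break
    return trajectory, state
```

A trivial usage: a policy that acts directly on the predicted label, and a transition that only advances when the action matches the true state, already demonstrates how classifier noise propagates into the rollouts used for imitation learning.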
Problem

Research questions and friction points this paper is trying to address.

Classify objects and estimate grasping positions using acoustic sensing
Detect extrinsic contact types for manipulation tasks like peg insertion
Train robust imitation learning policies with acoustic feedback
Innovation

Methods, ideas, or system contributions that make the work stand out.

Active acoustic sensing gripper with piezoelectric fingers
Acoustic vibration for object property and contact analysis
Imitation learning policy using acoustic feedback
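The sensing principle above (one piezoelectric finger emits a vibration, the other receives it through the grasped object) can be illustrated with a minimal signal-processing sketch. This is an assumed pipeline, not the paper's implementation: the actual excitation waveform, sample rate, sweep band, and feature choice may differ, and every parameter below (48 kHz sampling, 1–20 kHz linear sweep, 64 feature bins) is an illustrative assumption.

```python
import numpy as np

FS = 48_000      # sample rate in Hz (assumed)
DURATION = 0.1   # excitation length in seconds (assumed)

def linear_chirp(f0=1_000.0, f1=20_000.0, fs=FS, duration=DURATION):
    """Linear frequency sweep driven through the transmitting finger.

    Instantaneous phase of a linear chirp:
    phi(t) = 2*pi * (f0*t + (f1 - f0) * t**2 / (2 * duration))
    """
    t = np.arange(int(fs * duration)) / fs
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration))
    return np.sin(phase)

def spectral_features(received: np.ndarray, n_bins=64):
    """Compress the received signal's magnitude spectrum into n_bins features.

    Consecutive FFT bins are averaged so the feature vector has a fixed
    length regardless of recording duration; such a vector could feed a
    downstream classifier for object, grasp-pose, or contact-type labels.
    """
    mag = np.abs(np.fft.rfft(received))
    chunks = np.array_split(mag, n_bins)
    return np.array([c.mean() for c in chunks])
```

How an object transmits and attenuates different frequencies of the sweep depends on its material and contact state, which is what makes a simple spectral summary like this a plausible input to the classification models listed above.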
👥 Authors
Kaidi Zhang — Purdue University (robotics, tactile sensing)
Do-Gon Kim — Dept. of Mechanical Engineering, Columbia University
Eric T. Chang — PhD Student, Columbia University
Hua-Hsuan Liang — Dept. of Computer Science, Columbia University
Zhanpeng He — Stanford University, Amazon Robotics (robot learning, reinforcement learning, tactile sensors)
Kate Lampo — Dept. of Mechanical Engineering, Columbia University
Philippe Wu — Dept. of Mechanical Engineering, Columbia University
Ioannis Kymissis — Kenneth Brayer Professor of Electrical Engineering, Columbia University (electrical engineering, material science, organic semiconductors, thin film systems)
M. Ciocarlie — Dept. of Mechanical Engineering, Columbia University