Active Tactile Exploration for Rigid Body Pose and Shape Estimation

📅 2025-10-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses online estimation of the pose and shape of unknown rigid objects using only tactile sensing during robotic manipulation. The authors propose an active tactile exploration framework that combines physics-constrained modeling, which avoids the numerical stiffness of rigid contact, with an information-gain-driven exploration strategy, leveraging differentiable physics simulation and gradient-based optimization for joint pose and geometry identification. The key contributions are: (i) a unified objective that jointly optimizes a physics-constraint-violation loss and expected information gain; and (ii) a low-mobility-cost hybrid sampling scheme combining stochastic and active strategies. Experiments in simulation and on a real robotic platform demonstrate high-fidelity reconstruction of cuboid and convex polyhedral objects, with accurate pose and shape estimates obtained within approximately 10 seconds of tactile interaction, significantly improving online recognition efficiency and robustness.

📝 Abstract
General robot manipulation requires the handling of previously unseen objects. Learning a physically accurate model at test time can provide significant benefits in data efficiency, predictability, and reuse between tasks. Tactile sensing can complement vision with its robustness to occlusion, but its temporal sparsity necessitates careful online exploration to maintain data efficiency. Direct contact can also cause an unrestrained object to move, requiring both shape and location estimation. In this work, we propose a learning and exploration framework that uses only tactile data to simultaneously determine the shape and location of rigid objects with minimal robot motion. We build on recent advances in contact-rich system identification to formulate a loss function that penalizes physical constraint violation without introducing the numerical stiffness inherent in rigid-body contact. Optimizing this loss, we can learn cuboid and convex polyhedral geometries with less than 10s of randomly collected data after first contact. Our exploration scheme seeks to maximize Expected Information Gain and results in significantly faster learning in both simulated and real-robot experiments. More information can be found at https://dairlab.github.io/activetactile
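The abstract's loss penalizes violation of the rigid-contact constraints without enforcing them exactly. A standard way to write those constraints is as a complementarity condition between the signed distance φ and the contact-normal force λ: φ ≥ 0 (no penetration), λ ≥ 0 (contact can only push), and φ·λ = 0 (force only at contact). The sketch below is a minimal, hypothetical smooth-penalty version of such a loss; the paper's actual formulation is not reproduced here, and the function name and quadratic penalty form are illustrative assumptions.

```python
import numpy as np

def constraint_violation_loss(phi, lam):
    """Smooth penalty on the rigid-contact complementarity conditions
    (illustrative sketch, not the paper's exact loss).

    phi: signed distances at candidate contact points (shape (n,))
    lam: corresponding normal-force magnitudes (shape (n,))
    """
    penetration = np.square(np.minimum(phi, 0.0)).sum()  # penalize phi < 0
    adhesion = np.square(np.minimum(lam, 0.0)).sum()     # penalize lam < 0
    complementarity = np.square(phi * lam).sum()         # force only at contact
    return penetration + adhesion + complementarity
```

Because each term is smooth and zero when the constraints hold, the loss can be driven toward zero with gradient-based optimizers rather than a stiff rigid-contact solver, which is the numerical benefit the abstract alludes to.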
Problem

Research questions and friction points this paper is trying to address.

Estimating rigid object shape and pose using tactile sensing
Developing data-efficient exploration for physical model learning
Overcoming occlusion and motion challenges in robotic manipulation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses tactile data for rigid object shape and location estimation
Formulates loss function penalizing physical constraint violations
Exploration scheme maximizes Expected Information Gain for learning
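To make the Expected Information Gain (EIG) criterion above concrete: for a candidate probe, EIG is the expected reduction in entropy of the belief over shape/pose hypotheses, averaged over possible tactile observations. The sketch below assumes a discrete hypothesis set and observation set with known likelihoods; this is a generic Bayesian-experimental-design formulation, not the paper's implementation, and all names are hypothetical.

```python
import numpy as np

def expected_information_gain(prior, likelihoods):
    """EIG of one candidate probe (generic sketch, assumed discrete setting).

    prior: belief over hypotheses, shape (H,)
    likelihoods: p(observation | hypothesis), shape (O, H)
    """
    def entropy(p):
        p = p[p > 0]
        return -(p * np.log(p)).sum()

    p_obs = likelihoods @ prior  # marginal probability of each observation
    eig = entropy(prior)
    for o in range(likelihoods.shape[0]):
        if p_obs[o] > 0:
            posterior = likelihoods[o] * prior / p_obs[o]  # Bayes update
            eig -= p_obs[o] * entropy(posterior)           # expected posterior entropy
    return eig

def best_probe(prior, candidate_likelihoods):
    """Select the probe whose observation model yields the highest EIG."""
    gains = [expected_information_gain(prior, L) for L in candidate_likelihoods]
    return int(np.argmax(gains))
```

An uninformative probe (observations independent of the hypothesis) scores zero, while a fully discriminating probe recovers the full entropy of the prior, so greedily maximizing EIG steers contact toward the most ambiguity-reducing regions of the object.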
Ethan K. Gordon
General Robotics, Automation, Sensing, and Perception (GRASP) Laboratory, University of Pennsylvania, Philadelphia, PA, USA 19104
Bruke Baraki
General Robotics, Automation, Sensing, and Perception (GRASP) Laboratory, University of Pennsylvania, Philadelphia, PA, USA 19104
Hien Bui
General Robotics, Automation, Sensing, and Perception (GRASP) Laboratory, University of Pennsylvania, Philadelphia, PA, USA 19104
Michael Posa
Associate Professor, University of Pennsylvania
Robotics · Control · Optimization · Contact dynamics · Machine Learning