🤖 AI Summary
This work addresses object-agnostic 3D shape reconstruction from sparse tactile observations. Method: We propose a lightweight reconstruction framework integrating a single-view coarse visual prior with Bayesian active tactile exploration. First, we design an object-agnostic Bayesian active tactile policy that jointly optimizes information gain and contact failure avoidance under a minimal contact budget. Second, we introduce a two-stage deformable mesh fitting pipeline: geometric initialization from a single view, followed by uncertainty-aware mesh optimization that ensures global structural consistency while capturing local deformations. Contribution/Results: Evaluated in simulation and on real robotic platforms, our method reduces required contacts by 37% and decreases reconstruction error in deformable regions by 52% compared to baselines. It significantly improves accuracy and robustness under sparse tactile data, enabling high-fidelity geometric perception for dexterous manipulation.
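A policy of this kind typically scores each candidate contact by expected information gain minus a weighted failure risk, then touches the best-scoring point. The sketch below illustrates that trade-off only; it is not the authors' policy, and the function names, the assumed Gaussian belief with fixed sensor noise, and the toy candidate data are all hypothetical:

```python
import math

def expected_information_gain(variance, noise_var=1e-2):
    # Entropy reduction (up to constants) of a Gaussian surface belief
    # after observing a point with the given prior variance.
    return 0.5 * math.log(1.0 + variance / noise_var)

def select_next_contact(candidates, risk_weight=1.0):
    """Pick the candidate maximizing info gain minus weighted failure risk.

    candidates: list of (point, surface_variance, failure_prob) tuples.
    """
    best, best_score = None, -math.inf
    for point, variance, p_fail in candidates:
        score = expected_information_gain(variance) - risk_weight * p_fail
        if score > best_score:
            best, best_score = point, score
    return best

# Toy candidates: (xyz point, belief variance, estimated failure probability).
candidates = [
    ((0.0, 0.1, 0.2), 0.04, 0.05),  # low uncertainty, safe
    ((0.3, 0.0, 0.1), 0.25, 0.10),  # high uncertainty, mild risk
    ((0.1, 0.4, 0.0), 0.30, 0.80),  # high uncertainty, likely contact failure
]
print(select_next_contact(candidates))  # → (0.3, 0.0, 0.1)
```

Note how the third candidate has the highest uncertainty but is rejected: the risk penalty is what keeps the policy away from regions where the estimated surface may differ badly from the real one.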
📝 Abstract
The perception of an object's surface is important for robotic applications, as it enables robust object manipulation. The accuracy of such a representation affects the outcome of action planning, especially during tasks that require physical contact, e.g., grasping. In this paper, we propose a novel iterative method for 3D shape reconstruction consisting of two steps. First, a mesh based on a single primitive template is fitted to data points acquired from the object's surface. Subsequently, the mesh is adjusted to adequately represent local deformations. Moreover, a novel proactive tactile exploration strategy aims to minimize the total uncertainty with the fewest contacts, while reducing the risk of contact failure in case the estimated surface differs significantly from the real one. The performance of the method is evaluated both in 3D simulation and on a real robotic setup.
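The two-step fitting idea (a globally consistent template mesh, locally deformed where contacts are trusted) can be caricatured as blending a smoothing prior with contact targets, weighted by per-vertex uncertainty. This is a minimal one-dimensional sketch under those assumptions, not the paper's optimization; the function, its arguments, and the toy data are all hypothetical:

```python
def refine_vertices(vertices, neighbors, contacts, uncertainty, alpha=0.5):
    """One uncertainty-weighted refinement step (toy 1D heights for brevity).

    vertices: list of vertex heights; neighbors: adjacency index lists;
    contacts: measured target heights (None where unobserved);
    uncertainty: per-vertex in [0, 1] -- high values keep the smooth prior,
    low values pull the vertex onto the contact measurement.
    """
    new = []
    for i, v in enumerate(vertices):
        # Laplacian average of neighbors: the global-consistency term.
        laplacian = sum(vertices[j] for j in neighbors[i]) / len(neighbors[i])
        smooth = v + alpha * (laplacian - v)
        if contacts[i] is None:
            new.append(smooth)                          # prior only
        else:
            w = 1.0 - uncertainty[i]                    # data confidence
            new.append((1 - w) * smooth + w * contacts[i])  # local deformation
    return new

# Three vertices on a line; only the middle one has a confident contact.
vertices = [0.0, 0.0, 0.0]
neighbors = [[1], [0, 2], [1]]
contacts = [None, 1.0, None]
uncertainty = [1.0, 0.0, 1.0]
print(refine_vertices(vertices, neighbors, contacts, uncertainty))
# → [0.0, 1.0, 0.0]
```

Iterating such a step lets confident contacts reshape the surface locally while the Laplacian term keeps unobserved regions coherent with the template.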