🤖 AI Summary
Grasping objects of unknown shape buried in granular media remains challenging due to severe visual occlusion and high noise in tactile signals. Method: We propose GEOTACT, the first fully tactile-driven, end-to-end learned framework for autonomous retrieval of buried objects. It trains a policy with reinforcement learning under explicit sensor-noise modeling, combining granular-physics simulation, tactile encoding, and behavior learning. A curriculum-based simulation training strategy enables zero-shot transfer to real sandy environments. Contribution/Results: The learned policy autonomously discovers noise-robust pushing behaviors that funnel objects into stable, well-aligned grasps. Trained on only seven primitive shapes, GEOTACT robustly retrieves 35 diverse objects, including rigid, deformable, and articulated ones, substantially mitigating the uncertainty bottleneck in tactile perception.
📝 Abstract
We introduce GEOTACT, the first robotic system capable of grasping and retrieving objects of potentially unknown shapes buried in a granular environment. While important in many applications, ranging from mining and exploration to search and rescue, this type of interaction with granular media is difficult due to the uncertainty stemming from visual occlusion and noisy contact signals. To address these challenges, we use a learning method relying exclusively on touch feedback, trained end-to-end with simulated sensor noise. We show that our problem formulation leads to the natural emergence of learned pushing behaviors that the manipulator uses to reduce uncertainty and funnel the object to a stable grasp despite spurious and noisy tactile readings. We introduce a training curriculum that bootstraps learning in simulated granular environments, enabling zero-shot transfer to real hardware. Despite being trained only on seven objects with primitive shapes, our method is shown to successfully retrieve 35 different objects, including rigid, deformable, and articulated objects with complex shapes.
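The abstract notes that the policy is "trained end-to-end with simulated sensor noise" so that it becomes robust to spurious and noisy tactile readings. A minimal sketch of one common way to do this, randomly flipping binary taxel readings during training, is shown below; the function name, the flip-noise model, and the probability value are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def noisy_tactile(contacts, flip_prob=0.05, rng=None):
    """Hypothetical sketch: emulate spurious and missed contact signals
    by flipping each binary taxel reading with probability `flip_prob`.

    contacts: array of 0/1 tactile (taxel) readings
    flip_prob: per-taxel probability of a corrupted reading (assumed value)
    rng: optional numpy Generator for reproducibility
    """
    contacts = np.asarray(contacts)
    rng = rng if rng is not None else np.random.default_rng()
    # Draw an independent corruption mask, then invert the flagged taxels.
    flips = rng.random(contacts.shape) < flip_prob
    return np.where(flips, 1 - contacts, contacts)

# Example: corrupt a 5-taxel reading during a simulated training step.
reading = np.array([1, 0, 1, 1, 0])
corrupted = noisy_tactile(reading, flip_prob=0.1,
                          rng=np.random.default_rng(42))
```

Feeding such corrupted observations to the policy during simulated training is one standard way to encourage the noise-robust behaviors the abstract describes, and pairs naturally with zero-shot transfer to real sensors.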