🤖 AI Summary
Conventional whole-body tactile skins for robots suffer from poor geometric conformity and task-specific adaptability due to rigid, modular "one-size-fits-all" designs. Method: This paper proposes a task-driven, shape-agnostic, end-to-end computational design paradigm integrating parametric mesh generation, capacitive sensing simulation and optimization, multi-material 3D printing process planning, and robot-centric topological modeling, enabling fully automated design and fabrication of customized tactile skins directly from input geometry. Contribution/Results: The framework overcomes platform-specific limitations, supporting cross-platform deployment and context-aware customization. Six function-specialized whole-body tactile skins were generated and deployed on a Franka Research 3 robot arm. Experimental evaluation in human-robot interaction tasks demonstrates high conformality, sensing robustness, and dynamic scene adaptability.
📝 Abstract
Developing whole-body tactile skins for robots remains a challenging task, as existing solutions often prioritize modular, one-size-fits-all designs, which, while versatile, fail to account for the robot's specific shape and the unique demands of its operational context. In this work, we introduce GenTact Toolbox, a computational pipeline for creating versatile whole-body tactile skins tailored to both robot shape and application domain. Our method includes procedural mesh generation for conforming to a robot's topology, task-driven simulation to refine sensor distribution, and multi-material 3D printing for shape-agnostic fabrication. We validate our approach by creating and deploying six capacitive sensing skins on a Franka Research 3 robot arm in a human-robot interaction scenario. This work represents a shift from "one-size-fits-all" tactile sensors toward context-driven, highly adaptable designs that can be customized for a wide range of robotic systems and applications. The project website is available at https://hiro-group.ronc.one/gentacttoolbox