GenTact Toolbox: A Computational Design Pipeline to Procedurally Generate Context-Driven 3D Printed Whole-Body Artificial Skins

📅 2024-12-01
🤖 AI Summary
Conventional whole-body tactile skins for robots suffer from poor geometric conformity and limited task-specific adaptability due to rigid, modular "one-size-fits-all" designs. Method: The paper proposes a task-driven, shape-agnostic, end-to-end computational design pipeline that integrates parametric mesh generation, capacitive sensing simulation and optimization, multi-material 3D printing process planning, and robot-centric topological modeling, enabling fully automated design and fabrication of customized tactile skins directly from input geometry. Contribution/Results: The framework overcomes platform-specific limitations, supporting cross-platform deployment and context-aware customization. Six function-specialized whole-body tactile skins were generated and deployed on a Franka Research 3 robot arm. Experimental evaluation in human-robot interaction tasks demonstrates high conformality, sensing robustness, and adaptability to dynamic scenes.
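The first stage of the pipeline, parametric mesh generation conforming to a robot link, can be illustrated with a minimal sketch. This is not the paper's implementation; it is a hypothetical example that approximates a single cylindrical link as a parametric quad mesh, the kind of surface on which sensor taxels could later be distributed. The function name and parameters are assumptions for illustration only.

```python
import math

def cylinder_mesh(radius, height, n_seg=16, n_rings=8):
    """Parametric quad mesh for a cylindrical robot-link surface.

    Returns (vertices, quads): vertices as (x, y, z) tuples sampled on
    rings along the link axis, quads as vertex-index tuples wrapping
    around the circumference.
    """
    verts = []
    for r in range(n_rings + 1):
        z = height * r / n_rings
        for s in range(n_seg):
            a = 2 * math.pi * s / n_seg
            verts.append((radius * math.cos(a), radius * math.sin(a), z))
    quads = []
    for r in range(n_rings):
        for s in range(n_seg):
            i0 = r * n_seg + s                  # this ring, this segment
            i1 = r * n_seg + (s + 1) % n_seg    # wrap around the circumference
            quads.append((i0, i1, i1 + n_seg, i0 + n_seg))
    return verts, quads

verts, quads = cylinder_mesh(radius=0.05, height=0.3)
print(len(verts), len(quads))  # 144 128
```

A real pipeline would instead derive the surface from the robot's actual mesh or URDF geometry; the point here is only that the skin surface is generated procedurally from shape parameters rather than hand-modeled.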

📝 Abstract
Developing whole-body tactile skins for robots remains a challenging task, as existing solutions often prioritize modular, one-size-fits-all designs, which, while versatile, fail to account for the robot's specific shape and the unique demands of its operational context. In this work, we introduce GenTact Toolbox, a computational pipeline for creating versatile whole-body tactile skins tailored to both robot shape and application domain. Our method includes procedural mesh generation for conforming to a robot's topology, task-driven simulation to refine sensor distribution, and multi-material 3D printing for shape-agnostic fabrication. We validate our approach by creating and deploying six capacitive sensing skins on a Franka Research 3 robot arm in a human-robot interaction scenario. This work represents a shift from "one-size-fits-all" tactile sensors toward context-driven, highly adaptable designs that can be customized for a wide range of robotic systems and applications. The project website is available at https://hiro-group.ronc.one/gentacttoolbox
Problem

Research questions and friction points this paper is trying to address.

Develops context-driven 3D printed whole-body tactile skins for robots.
Addresses limitations of modular, one-size-fits-all tactile sensor designs.
Customizes tactile skins for robot shape and specific application demands.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Procedural mesh generation for robot topology
Task-driven simulation for sensor distribution
Multi-material 3D printing for shape-agnostic fabrication
Carson Kohlbrenner
University of Colorado Boulder, 1111 Engineering Drive, Boulder, CO USA
Caleb Escobedo
PhD Student in Computer Science, University of Colorado Boulder
Robotics, Human-Robot Interaction, Control, AI
S. Sandra Bae
University of Colorado Boulder, 1111 Engineering Drive, Boulder, CO USA
Alexander Dickhans
University of Colorado Boulder, 1111 Engineering Drive, Boulder, CO USA
Alessandro Roncone
Assistant Professor, University of Colorado, Boulder, CO, U.S.A
Robotics, Human-Robot Interaction, Human-AI Teaming, Tactile Sensing, Humanoids