TacSL: A Library for Visuotactile Sensor Simulation and Learning

📅 2024-08-12
🏛️ IEEE Transactions on Robotics
📈 Citations: 3
Influential: 0
🤖 AI Summary
To address the challenges of complex contact modeling, signal distortion in tactile image generation, and difficult sim-to-real transfer in visuotactile sensing, this paper introduces TacSL, the first GPU-accelerated simulation and learning library for visuotactile perception. The approach comprises: (1) physics-consistent tactile image simulation and contact-force distribution extraction within Isaac Sim, over 200× faster than the prior state of the art; (2) an online reinforcement-learning algorithm, asymmetric actor-critic distillation (AACD), for efficient multimodal policy learning; and (3) an end-to-end sim-to-real transfer toolchain. Evaluated on contact-rich tasks including grasping and assembly, TacSL improves both policy-training efficiency and real-robot deployment performance, providing a scalable, reproducible, open-source infrastructure for tactile intelligence research.

📝 Abstract
For both humans and robots, the sense of touch, known as tactile sensing, is critical for performing contact-rich manipulation tasks. Three key challenges in robotic tactile sensing are 1) interpreting sensor signals, 2) generating sensor signals in novel scenarios, and 3) learning sensor-based policies. For visuotactile sensors, interpretation has been facilitated by their close relationship with vision sensors (e.g., RGB cameras). However, generation is still difficult, as visuotactile sensors typically involve contact, deformation, illumination, and imaging, all of which are expensive to simulate; in turn, policy learning has been challenging, as simulation cannot be leveraged for large-scale data collection. We present TacSL (taxel), a library for GPU-based visuotactile sensor simulation and learning. TacSL can be used to simulate visuotactile images and extract contact-force distributions over $200\times$ faster than the prior state-of-the-art, all within the widely-used Isaac Simulator. Furthermore, TacSL provides a learning toolkit containing multiple sensor models, contact-intensive training environments, and online/offline algorithms that can facilitate policy learning for sim-to-real applications. On the algorithmic side, we introduce a novel online reinforcement-learning algorithm called asymmetric actor-critic distillation (AACD), designed to effectively and efficiently learn tactile-based policies in simulation that can transfer to the real world. Finally, we demonstrate the utility of our library and algorithms by evaluating the benefits of distillation and multimodal sensing for contact-rich manipulation tasks, and most critically, performing sim-to-real transfer. Supplementary videos and results are at https://iakinola23.github.io/tacsl/.
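The core idea behind asymmetric actor-critic distillation can be illustrated with a minimal sketch: a teacher trained on privileged simulator state is distilled into a student that sees only the tactile observations available on a real robot. This is not the TacSL implementation; all names, dimensions, and network shapes below are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of asymmetric actor-critic distillation (AACD);
# the real TacSL code differs. The asymmetry: the critic and teacher
# actor consume privileged simulator state, while the student actor
# consumes only tactile observations deployable on hardware.

class MLP(nn.Module):
    def __init__(self, in_dim, out_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))

    def forward(self, x):
        return self.net(x)

state_dim, tactile_dim, act_dim = 12, 32, 6  # illustrative sizes

critic = MLP(state_dim, 1)            # privileged critic (sim-only)
teacher_actor = MLP(state_dim, act_dim)
student_actor = MLP(tactile_dim, act_dim)

opt = torch.optim.Adam(student_actor.parameters(), lr=1e-3)

# One distillation step: regress student actions onto teacher actions
# for a batch of paired (privileged state, tactile) observations.
state = torch.randn(128, state_dim)
tactile = torch.randn(128, tactile_dim)
with torch.no_grad():
    target = teacher_actor(state)
loss = nn.functional.mse_loss(student_actor(tactile), target)
opt.zero_grad()
loss.backward()
opt.step()
```

In practice the teacher would itself be trained with RL (with the privileged critic providing value estimates), and distillation would run online alongside environment interaction; the sketch shows only the supervised student-update step.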
Problem

Research questions and friction points this paper is trying to address.

Simulate visuotactile sensor signals efficiently
Facilitate policy learning for tactile-based tasks
Enable sim-to-real transfer for robotic manipulation
Innovation

Methods, ideas, or system contributions that make the work stand out.

GPU-based visuotactile sensor simulation
Asymmetric actor-critic distillation algorithm
Sim-to-real transfer for tactile policies
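To give a feel for the GPU-batched contact-force computation the library accelerates, here is a toy sketch of a linear penalty-contact model mapping per-taxel penetration depths to a normal-force image across a batch of sensors. This is not the TacSL API; the function name, stiffness value, and grid sizes are assumptions for illustration.

```python
import torch

def contact_force_image(depth, stiffness=1e3):
    """Toy penalty-contact model (illustrative, not TacSL's).

    depth: (B, H, W) per-taxel penetration depth in meters
           (values <= 0 mean no contact).
    Returns a (B, H, W) tensor of normal forces, F = k * max(depth, 0).
    """
    return stiffness * torch.clamp(depth, min=0.0)

# A batch of 4 sensors with a 16x16 taxel grid, evaluated in parallel
# on the GPU when one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
depth = torch.randn(4, 16, 16, device=device) * 1e-3
forces = contact_force_image(depth)
```

Because the whole batch is one tensor operation, thousands of simulated sensors can be evaluated per step, which is the property that makes large-scale tactile policy training feasible.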