🤖 AI Summary
Soft pneumatic robots lack high-resolution, robust, and easily integrable tactile and proprioceptive sensing, which hinders their real-world deployment. To address this, we propose a novel vision-based perception paradigm that leverages embedded micro-cameras to reconstruct the internal deformation of pneumatic soft arms, enabling simultaneous proprioceptive pose estimation and external contact detection. Our approach integrates optical-dynamic co-simulation, an end-to-end simulation-to-reality (Sim2Real) transfer pipeline, and an image-driven deformation decoding network, achieving zero-shot Sim2Real adaptation without any real-world calibration data. Experimental results demonstrate sub-millimeter pose estimation accuracy and reliable tactile responsiveness during complex manipulation tasks. The system significantly improves the perceptual robustness and deployment efficiency of soft robots in industrial assembly and human–robot interaction scenarios.
📝 Abstract
Soft pneumatic robot manipulators are popular in industrial and human-interactive applications due to their compliance and flexibility. However, deploying them in real-world scenarios requires advanced sensing for tactile feedback and proprioception. Our work presents a novel vision-based approach for sensorizing soft robots. We demonstrate our approach on PneuGelSight, a pioneering pneumatic manipulator featuring high-resolution proprioception and tactile sensing via an embedded camera. To optimize the sensor's performance, we introduce a comprehensive pipeline that accurately simulates its optical and dynamic properties, enabling zero-shot knowledge transfer from simulation to real-world applications. PneuGelSight and our sim-to-real pipeline provide a novel, easily implementable, and robust sensing methodology for soft robots, paving the way for more advanced soft robots with enhanced sensory capabilities.
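To make the simulation-to-reality idea concrete, the following is a minimal, self-contained sketch of the general pattern: train an image-to-pose decoder entirely on simulated, domain-randomized data, then evaluate it zero-shot on freshly rendered inputs. Everything here is hypothetical (the toy renderer, the tiny image size, and the linear least-squares decoder standing in for the paper's deformation decoding network); it illustrates the workflow, not PneuGelSight's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_synthetic(pose, noise=0.05):
    """Hypothetical stand-in for the optical simulator: maps a 3-DoF
    deformation pose to a coarse 8x8 'internal camera image', with
    randomized noise playing the role of domain randomization."""
    yy, xx = np.mgrid[0:8, 0:8] / 7.0
    img = pose[0] * xx + pose[1] * yy + pose[2] * xx * yy
    return img + noise * rng.standard_normal(img.shape)

# Generate a purely simulated training set (pose -> image), then fit a
# linear decoder image -> pose by least squares; this plays the role of
# the learned deformation decoding network.
train_poses = rng.uniform(-1, 1, size=(500, 3))
train_images = np.stack([render_synthetic(p) for p in train_poses]).reshape(500, -1)
W, *_ = np.linalg.lstsq(train_images, train_poses, rcond=None)

# Zero-shot evaluation: no calibration data, just new rendered images.
test_poses = rng.uniform(-1, 1, size=(100, 3))
test_images = np.stack([render_synthetic(p) for p in test_poses]).reshape(100, -1)
pred = test_images @ W
err = np.abs(pred - test_poses).mean()  # mean absolute pose error
```

In the real system the renderer is a physically accurate optical-dynamic co-simulation and the decoder is a neural network, but the zero-shot structure (fit in simulation, deploy without real-world calibration) is the same.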