🤖 AI Summary
Existing vision-based tactile sensors (e.g., DIGIT, GelSight) rely on static structured light, resulting in low imaging contrast and limited deformation-sensing accuracy. To address this, we propose a novel paradigm integrating dynamic structured illumination with multi-frame image fusion: temporally programmable structured-light patterns are projected in sequence, and the resulting multi-illumination images are registered and fused via weighted gradient-domain optimization to achieve high-fidelity surface-deformation reconstruction. The approach requires no hardware modification and is the first to introduce dynamically coded illumination and image fusion into vision-based tactile sensing, opening new avenues for adaptive sensor design. Experimental results demonstrate a 42% increase in image contrast, a 35% improvement in edge sharpness, and a 51% enhancement in background discriminability, enabling robust, sub-millimeter-resolution reconstruction of surface deformations.
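To make the fusion step concrete, here is a minimal sketch of what weighted gradient-domain fusion could look like, assuming grayscale frames that are already registered. The function name, the gradient-magnitude weighting, and the FFT-based Poisson solve with periodic boundaries are illustrative choices for this sketch, not the paper's implementation:

```python
# Sketch of multi-frame weighted gradient-domain fusion (assumed design,
# not the paper's code). Input: N registered grayscale frames, one per
# illumination pattern, stacked as an (N, H, W) array.
import numpy as np


def fuse_gradient_domain(frames, eps=1e-8):
    """Fuse registered frames by combining their gradient fields, weighted
    per pixel by gradient magnitude, then reconstructing one image via a
    Poisson solve."""
    frames = np.asarray(frames, dtype=np.float64)

    # Periodic forward differences per frame.
    gx = np.roll(frames, -1, axis=2) - frames
    gy = np.roll(frames, -1, axis=1) - frames

    # Weight each frame's gradient by its local magnitude, so the
    # illumination pattern that best reveals a region dominates there.
    mag = np.sqrt(gx ** 2 + gy ** 2) + eps
    w = mag / mag.sum(axis=0, keepdims=True)
    fx = (w * gx).sum(axis=0)
    fy = (w * gy).sum(axis=0)

    # Normal equations for min_u ||grad(u) - (fx, fy)||^2, i.e.
    # D^T D u = D^T f, with D^T the adjoint of the periodic difference.
    rhs = (np.roll(fx, 1, axis=1) - fx) + (np.roll(fy, 1, axis=0) - fy)

    # The FFT diagonalizes D^T D under periodic boundary conditions.
    h, wd = rhs.shape
    lap = (2 - 2 * np.cos(2 * np.pi * np.fft.fftfreq(h)))[:, None] \
        + (2 - 2 * np.cos(2 * np.pi * np.fft.fftfreq(wd)))[None, :]
    lap[0, 0] = 1.0                                        # avoid 0/0 at DC
    u_hat = np.fft.fft2(rhs) / lap
    u_hat[0, 0] = np.fft.fft2(frames.mean(axis=0))[0, 0]   # pin mean brightness
    return np.real(np.fft.ifft2(u_hat))
```

The key property of this construction is that the fused image keeps, at each pixel, the strongest edge response available anywhere in the stack; a sparse Poisson solver with Neumann boundaries would serve equally well in place of the FFT solve.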
📝 Abstract
Vision-based tactile sensors use structured light to measure deformation in their elastomeric interface. To date, vision-based tactile sensors such as DIGIT and GelSight have used a single, static pattern of structured light tuned to the specific form factor of the sensor. In this work, we investigate the effectiveness of dynamic illumination patterns, in conjunction with image fusion techniques, for improving the sensing quality of vision-based tactile sensors. Specifically, we propose to capture multiple measurements, each under a different illumination pattern, and fuse them into a single, higher-quality measurement. Experimental results demonstrate that this type of dynamic illumination yields significant improvements in image contrast, sharpness, and background difference. This finding opens the possibility of retroactively improving the sensing quality of existing vision-based tactile sensors with a simple software update, and of designing new hardware capable of fully exploiting dynamic illumination.
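The abstract does not define the three reported metrics; common choices in this setting would be RMS contrast, variance-of-Laplacian sharpness, and mean absolute difference from a no-contact reference frame. The sketch below uses those assumed definitions:

```python
# Assumed metric definitions (the paper's exact formulas are not given here):
# RMS contrast, variance-of-Laplacian sharpness, and mean absolute difference
# from a reference frame captured with no contact.
import numpy as np
from scipy.ndimage import laplace


def rms_contrast(img: np.ndarray) -> float:
    """RMS contrast: standard deviation of pixel intensities."""
    return float(np.std(img))


def edge_sharpness(img: np.ndarray) -> float:
    """Variance of the Laplacian, a standard sharpness proxy."""
    return float(np.var(laplace(img.astype(np.float64))))


def background_difference(img: np.ndarray, background: np.ndarray) -> float:
    """Mean absolute deviation from a no-contact reference frame; higher
    values mean the contact signal stands out more from the background."""
    return float(np.mean(np.abs(img.astype(np.float64)
                                - background.astype(np.float64))))
```

Comparing these scores between a single static-illumination frame and the fused multi-illumination result would measure the kind of relative improvements the abstract reports.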