Enhance Vision-based Tactile Sensors via Dynamic Illumination and Image Fusion

📅 2025-03-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing vision-based tactile sensors (e.g., DIGIT, GelSight) rely on static structured light, resulting in low imaging contrast and limited deformation sensing accuracy. To address this, we propose a novel paradigm integrating dynamic structured illumination with multi-frame image fusion: temporally programmable structured light patterns are sequentially projected, and the acquired multi-view images are registered and fused via weighted gradient-domain optimization to achieve high-fidelity surface deformation reconstruction. This approach requires no hardware modification and is the first to introduce dynamic coded illumination and image fusion into vision-based tactile sensing, opening new avenues for adaptive sensor design. Experimental results demonstrate a 42% increase in image contrast, a 35% improvement in edge sharpness, and a 51% enhancement in background discriminability—enabling robust, sub-millimeter-resolution reconstruction of surface deformations.

Technology Category

Application Category

📝 Abstract
Vision-based tactile sensors use structured light to measure deformation in their elastomeric interface. Until now, vision-based tactile sensors such as DIGIT and GelSight have been using a single, static pattern of structured light tuned to the specific form factor of the sensor. In this work, we investigate the effectiveness of dynamic illumination patterns, in conjunction with image fusion techniques, to improve the quality of sensing of vision-based tactile sensors. Specifically, we propose to capture multiple measurements, each with a different illumination pattern, and then fuse them together to obtain a single, higher-quality measurement. Experimental results demonstrate that this type of dynamic illumination yields significant improvements in image contrast, sharpness, and background difference. This discovery opens the possibility of retroactively improving the sensing quality of existing vision-based tactile sensors with a simple software update, and for new hardware designs capable of fully exploiting dynamic illumination.
Problem

Research questions and friction points this paper is trying to address.

Improve vision-based tactile sensor quality via dynamic illumination
Enhance image contrast and sharpness with multiple illumination patterns
Enable retroactive software upgrades for existing tactile sensors
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic illumination patterns enhance tactile sensing
Image fusion improves measurement quality
Software-upgradable for existing tactile sensors
🔎 Similar Papers
No similar papers found.
A
Artemii Redkin
LASR Lab, TU Dresden, Dresden, Germany
Zdravko Dugonjic
Zdravko Dugonjic
LASR Lab, Technische Universität Dresden
Machine LearningTactile Sensing
M
Mike Lambeta
Meta AI, Menlo Park, CA, USA
R
Roberto Calandra
LASR Lab, TU Dresden, Dresden, Germany