🤖 AI Summary
Existing vision-based tactile methods rely on sensing mechanical deformation, leaving them vulnerable to occlusion and illumination variation and unable to capture fingertip pressure distribution directly. This work proposes an image-based tactile paradigm grounded in contact-induced shifts in skin optical chromaticity rather than in deformation. High-resolution cameras capture minute CIELAB color shifts in a soft, elastomeric tactile skin under load, and a physics-informed chromaticity–pressure mapping model is established from these shifts. By integrating multispectral imaging, a lightweight neural network, and calibration-compensation algorithms, the approach achieves high-accuracy force estimation without embedded electronic sensors. It attains a prediction error of ±0.08 N over a 0–5 N range and a spatial resolution of 2 mm, and is three times more robust to illumination interference than state-of-the-art vision-based tactile methods. This framework offers a high-fidelity, robust pathway to tactile perception for robotic teleoperation and natural human–robot interaction.
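The core idea of a chromaticity–pressure mapping can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's method: it assumes CIELAB values are already extracted per skin patch, uses the simple CIE76 color difference ΔE, and fits a single linear gain from hypothetical calibration pairs in place of the paper's physics-informed model and neural network. All numeric values are invented for illustration.

```python
import math

def delta_e_cie76(lab_ref, lab_obs):
    """CIE76 color difference between two CIELAB triples (L*, a*, b*)."""
    return math.sqrt(sum((r - o) ** 2 for r, o in zip(lab_ref, lab_obs)))

def fit_linear_gain(delta_es, forces):
    """Least-squares gain k for force ≈ k * ΔE (no intercept)."""
    num = sum(d * f for d, f in zip(delta_es, forces))
    den = sum(d * d for d in delta_es)
    return num / den

# Hypothetical calibration data: ΔE measured at known loads (N).
calib_de = [0.0, 1.1, 2.0, 3.1, 4.0, 5.2]
calib_f  = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
k = fit_linear_gain(calib_de, calib_f)

# Estimate force from an observed color shift at one skin patch.
lab_unloaded = (62.0, 18.0, 12.0)   # illustrative CIELAB values
lab_loaded   = (60.5, 19.2, 13.1)
de = delta_e_cie76(lab_unloaded, lab_loaded)
force = k * de  # estimated contact force in newtons
```

In practice the paper replaces the single gain with a learned, illumination-compensated mapping, but the calibration-then-inference structure is the same.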