🤖 AI Summary
This work addresses a core challenge in soft robotic tactile sensing: strain-induced artifacts in electrical impedance tomography (EIT) measurements severely degrade tactile image reconstruction on highly deformable surfaces. To address this, the authors propose an end-to-end deep learning framework that explicitly decouples surface deformation estimation from EIT signal inversion. The framework jointly processes raw EIT boundary voltages and real-time deformation sensor readings through a custom multimodal neural network featuring an EIT-deformation feature fusion module. Comprehensive simulations demonstrate high reconstruction fidelity, with correlation coefficients of 0.966–0.9999, PSNRs of 28.7–55.5 dB, and relative image errors of 0.0107–0.0805. Experimental validation on a hydrogel-based electronic skin under large, dynamic deformations further confirms the method's accuracy and robustness in practical soft tactile sensing scenarios.
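The two-branch fuse-by-concatenation idea behind the EIT-deformation fusion module can be illustrated with a minimal forward pass. Everything here is a hypothetical sketch: the layer sizes, the 16-electrode/208-voltage assumption, the 12-channel deformation input, and the 32x32 output grid are illustrative choices, and the random weights stand in for parameters the actual paper learns by training a deep network.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical dimensions: a 16-electrode EIT array yields 208 boundary
# voltage measurements; a deformation sensor provides 12 strain readings;
# the output is a 32x32 conductivity-change image.
N_VOLT, N_DEF, N_PIX = 208, 12, 32 * 32

# Randomly initialised weights stand in for trained network parameters.
W_eit = rng.standard_normal((N_VOLT, 64)) * 0.05
W_def = rng.standard_normal((N_DEF, 64)) * 0.05
W_fuse = rng.standard_normal((128, N_PIX)) * 0.05

def reconstruct(voltages, deformation):
    # Branch 1: encode the raw EIT boundary voltages into a feature vector.
    h_eit = relu(voltages @ W_eit)
    # Branch 2: encode the deformation sensor readings separately.
    h_def = relu(deformation @ W_def)
    # Fusion: concatenate the two feature vectors, then decode to an image.
    fused = np.concatenate([h_eit, h_def])
    return (fused @ W_fuse).reshape(32, 32)
```

The point of the separate encoders is that the strain-induced component of the signal is represented explicitly before fusion, rather than forcing a single network to disentangle contact and deformation from the voltages alone.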
📝 Abstract
Electrical impedance tomography (EIT)-based tactile sensors offer cost-effective and scalable solutions for robotic sensing, and are especially promising for soft robots. However, a major issue with EIT-based tactile sensors applied to highly deformable objects is performance degradation due to surface deformations. This limitation stems from their inherent sensitivity to strain, which is particularly exacerbated in soft bodies, thus requiring dedicated data interpretation to disentangle the parameter being measured from the signal arising from shape changes. This has largely limited their practical implementation. This article presents a machine learning-assisted tactile sensing approach that addresses this challenge by tracking surface deformations and separating this contribution from the signal readout during tactile sensing. We first capture the deformations of the target object, then perform tactile reconstruction using a deep learning model specifically designed to process and fuse EIT data and deformation information. Validations using numerical simulations achieved high correlation coefficients (0.9660–0.9999), peak signal-to-noise ratios (PSNRs) (28.7221–55.5264 dB), and low relative image errors (RIEs) (0.0107–0.0805). Experimental validations, using a hydrogel-based EIT e-skin under various deformation scenarios, further demonstrated the effectiveness of the proposed approach in real-world settings. The findings could underpin enhanced tactile interaction in soft and highly deformable robotic applications.
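The three reported metrics have standard definitions that can be computed directly from a reconstructed image and its ground truth. The sketch below uses one common convention for each (Pearson correlation, PSNR with the ground-truth dynamic range as the peak, and RIE as a relative Frobenius-norm residual); the paper's exact formulas may differ in detail.

```python
import numpy as np

def correlation_coefficient(recon, truth):
    # Pearson correlation between the flattened reconstructed and
    # ground-truth images (1.0 means a perfect linear match).
    return np.corrcoef(recon.ravel(), truth.ravel())[0, 1]

def psnr(recon, truth):
    # Peak signal-to-noise ratio in dB; the peak is taken here as the
    # dynamic range of the ground-truth image.
    mse = np.mean((recon - truth) ** 2)
    peak = truth.max() - truth.min()
    return 10.0 * np.log10(peak ** 2 / mse)

def relative_image_error(recon, truth):
    # RIE: norm of the residual divided by the norm of the ground truth,
    # so lower is better and 0 is a perfect reconstruction.
    return np.linalg.norm(recon - truth) / np.linalg.norm(truth)
```

For example, a reconstruction that differs from a unit-range ground truth by a uniform 0.01 offset scores a PSNR of 40 dB and a correlation of 1.0, which helps calibrate the reported 28.7–55.5 dB range.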