🤖 AI Summary
Magnetic-based tactile sensors (MBTS) suffer from low spatial resolution due to sparse sensor arrays, limiting their applicability in high-precision robotic tactile tasks. To address this, we propose the first cross-modal super-resolution method specifically designed for magnetic tactile sensing: it leverages synchronously captured high-resolution visual tactile data to supervise a conditional generative model for real-time geometric reconstruction from low-resolution magnetic signals. Using a co-designed open-source vision–magnetic composite sensor (magnetic sampling rate: 125 Hz), we implement a conditional variational autoencoder (CVAE) that explicitly models the mapping from sparse magnetic measurements to fine-grained geometric shapes. The method enables end-to-end, real-time super-resolution reconstruction (inference < 2.5 ms) while preserving physical interpretability. It significantly enhances spatial resolution and establishes a novel paradigm for high-fidelity tactile perception.
📄 Abstract
Magnetic-based tactile sensors (MBTS) combine the advantages of compact design and high-frequency operation but suffer from limited spatial resolution due to their sparse taxel arrays. This paper proposes SuperMag, a tactile shape reconstruction method that addresses this limitation by leveraging high-resolution vision-based tactile sensor (VBTS) data to supervise MBTS super-resolution. Co-designed, open-source VBTS and MBTS with identical contact modules enable synchronized collection of high-resolution shapes and magnetic signals via a symmetric calibration setup. We frame tactile shape reconstruction as a conditional generative problem, employing a conditional variational auto-encoder to infer high-resolution shapes from low-resolution MBTS inputs. The MBTS achieves a sampling frequency of 125 Hz, while shape reconstruction completes inference within 2.5 ms. This cross-modal synergy advances the tactile perception of MBTS, potentially unlocking new capabilities in high-precision robotic tasks.
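The conditional generative formulation above can be sketched as a minimal CVAE forward pass: an encoder q(z | shape, mag) used during training with paired VBTS shapes, and a decoder p(shape | z, mag) that maps a latent code plus the sparse magnetic reading (the condition) to a dense shape map. The taxel count, output resolution, and plain linear layers below are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TAXELS = 3 * 9    # assumed: 9 tri-axis magnetic taxels (low-res condition)
N_SHAPE = 32 * 32   # assumed: dense depth map (high-res shape target)
N_LATENT = 16

def linear(n_in, n_out):
    """Random weights and zero bias for one illustrative linear layer."""
    return rng.normal(0.0, 0.05, (n_in, n_out)), np.zeros(n_out)

# Encoder q(z | shape, mag) and decoder p(shape | z, mag)
W_mu, b_mu = linear(N_SHAPE + N_TAXELS, N_LATENT)
W_lv, b_lv = linear(N_SHAPE + N_TAXELS, N_LATENT)
W_dec, b_dec = linear(N_LATENT + N_TAXELS, N_SHAPE)

def encode(shape, mag):
    # Condition the posterior on both the VBTS shape and the magnetic signal
    h = np.concatenate([shape, mag])
    return h @ W_mu + b_mu, h @ W_lv + b_lv

def reparameterize(mu, logvar):
    # Reparameterization trick: z = mu + sigma * eps
    return mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)

def decode(z, mag):
    # The decoder sees the magnetic condition alongside the latent code
    return np.concatenate([z, mag]) @ W_dec + b_dec

# Training-time pass: encode a paired VBTS shape, reconstruct under the
# magnetic condition (losses omitted in this sketch).
shape_gt = rng.standard_normal(N_SHAPE)
mag = rng.standard_normal(N_TAXELS)
mu, logvar = encode(shape_gt, mag)
recon = decode(reparameterize(mu, logvar), mag)

# Test-time super-resolution: sample z from the prior N(0, I) and decode,
# conditioned only on the low-resolution magnetic input.
shape_pred = decode(rng.standard_normal(N_LATENT), mag)
print(recon.shape, shape_pred.shape)
```

At deployment only `decode` runs per frame, which is consistent with the millisecond-scale inference budget reported for the 125 Hz magnetic stream.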