🤖 AI Summary
Existing tactile databases predominantly rely on free exploration, hindering systematic analysis of how motion parameters—such as sliding velocity and direction—affect sliding tactile perception. To address this, we introduce the first multimodal texture database explicitly designed for sliding tactile perception. Using a biomimetic urethane rubber fingertip, we simultaneously acquire high-resolution tactile, visual, and auditory signals across five sliding velocities and eight directions, precisely characterizing probe–texture interaction dynamics. We propose a dual-dimensional modeling framework that jointly encodes velocity and direction—overcoming the limitations of approaches that model only one of these factors—and develop a flexible fingertip sensing system, a precision trajectory control platform, and a multimodal temporal alignment annotation method. Experiments demonstrate >92% surface recognition accuracy, mean velocity estimation error <0.15 m/s, and 96.7% direction classification accuracy—significantly enhancing model generalizability and fine-grained discriminative capability.
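To illustrate the dual-dimensional labeling idea described above, here is a minimal sketch of training a classifier on jointly encoded velocity and direction labels. The feature vectors are synthetic stand-ins (the database's actual signal format and loading API are not specified here), and the joint-label encoding is one plausible scheme, not necessarily the authors' method:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in features: in the real database, each sample would be a
# window of tactile/audio/visual signals recorded during one sliding pass.
n_samples, n_features = 1000, 32
X = rng.normal(size=(n_samples, n_features))
velocity = rng.integers(0, 5, size=n_samples)    # 5 velocity levels
direction = rng.integers(0, 8, size=n_samples)   # 8 sliding directions

# Dual-dimensional encoding: fold velocity and direction into one joint
# class label (5 x 8 = 40 classes), so the model learns both jointly.
joint_label = velocity * 8 + direction

X_train, X_test, y_train, y_test = train_test_split(
    X, joint_label, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# Recover the two factors from the joint prediction.
pred_velocity, pred_direction = pred // 8, pred % 8
```

With real tactile features the joint classes are separable; on this random stand-in data the pipeline only demonstrates the encode/decode round trip, not meaningful accuracy.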
📝 Abstract
Human perception integrates multisensory information, with tactile perception playing a key role in object and surface recognition. While human-machine interfaces with haptic modalities offer enhanced system performance, existing datasets focus primarily on visual data, overlooking comprehensive haptic information. Previous haptic texture databases have recorded sound and acceleration signals, but often ignore the nuanced differences between probe–texture and finger–texture interactions. Recognizing this shortcoming, we present the Cluster Haptic Texture Database, a multimodal dataset that records visual, auditory, and haptic signals from an artificial urethane rubber fingertip interacting with different textured surfaces. This database, designed to mimic the properties of the human finger, includes five velocity levels and eight directional variations, enabling systematic study of tactile interactions. Our evaluations demonstrate that classifiers trained on this dataset identify surfaces effectively, and reveal the subtleties of estimating sliding velocity and direction for each surface.