Cluster Haptic Texture Database: Haptic Texture Database with Variety in Velocity and Direction of Sliding Contacts

📅 2024-07-23
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing tactile databases predominantly rely on free exploration, hindering systematic analysis of how motion parameters—such as sliding velocity and direction—affect sliding tactile perception. To address this, we introduce the first multimodal texture database explicitly designed for sliding tactile perception. Using a biomimetic urethane rubber fingertip, we simultaneously acquire high-resolution tactile, visual, and auditory signals across five sliding velocities and eight directions, precisely characterizing probe–texture interaction dynamics. We propose a novel dual-dimensional modeling framework that jointly encodes velocity and direction—overcoming limitations of unimodal approaches—and develop a flexible fingertip sensing system, a precision trajectory control platform, and a multimodal temporal alignment annotation method. Experiments demonstrate >92% surface recognition accuracy, mean velocity estimation error <0.15 m/s, and 96.7% direction classification accuracy—significantly enhancing model generalizability and fine-grained discriminative capability.
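The summary above describes jointly estimating surface identity, sliding velocity, and sliding direction from shared multimodal features. A minimal multi-head sketch of that idea (all feature sizes and the linear-head design are assumptions for illustration, not the paper's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions for illustration; the paper's real feature sizes
# and model architecture are not given on this page.
FEAT_DIM = 128      # shared multimodal feature (tactile + vision + audio)
N_SURFACES = 118    # textured surfaces in the database
N_DIRECTIONS = 8    # sliding directions

# Three heads over one shared feature: surface classification, direction
# classification, and scalar velocity regression -- the "dual-dimensional"
# idea of jointly encoding velocity and direction, sketched as multi-task
# linear heads (untrained random weights here).
W_surface = rng.normal(size=(FEAT_DIM, N_SURFACES))
W_direction = rng.normal(size=(FEAT_DIM, N_DIRECTIONS))
w_velocity = rng.normal(size=FEAT_DIM)

def predict(feature: np.ndarray) -> tuple[int, int, float]:
    surface = int(np.argmax(feature @ W_surface))
    direction = int(np.argmax(feature @ W_direction))
    velocity = float(feature @ w_velocity)  # would be in m/s after training
    return surface, direction, velocity

surface, direction, velocity = predict(rng.normal(size=FEAT_DIM))
```

The point of the sketch is only the output structure: one shared representation feeding a 118-way surface head, an 8-way direction head, and a continuous velocity head, matching the three evaluation metrics reported above.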

📝 Abstract
Human perception integrates multisensory information, with tactile perception playing a key role in object and surface recognition. While human-machine interfaces with haptic modalities offer enhanced system performance, existing datasets focus primarily on visual data, overlooking comprehensive haptic information. Previous haptic texture databases have recorded sound and acceleration signals, but often ignore the nuanced differences between probe-texture and finger-texture interactions. Recognizing this shortcoming, we present the Cluster Haptic Texture Database, a multimodal dataset that records visual, auditory, and haptic signals from an artificial urethane rubber fingertip interacting with different textured surfaces. This database, designed to mimic the properties of the human finger, includes five velocity levels and eight directional variations, providing a comprehensive study of tactile interactions. Our evaluations reveal the effectiveness of classifiers trained on this dataset in identifying surfaces, and the subtleties of estimating velocity and direction for each surface.
Problem

Research questions and friction points this paper is trying to address.

Lack of controlled haptic data collection methods
Insufficient analysis of motion parameters' impact on tactile perception
Need for multimodal datasets for haptic research and applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

3-axis machine with artificial finger control
Multimodal dataset with 118 textured surfaces
160 conditions per surface for comprehensive analysis
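The condition grid above can be enumerated directly: 5 velocity levels × 8 directions gives 40 velocity-direction pairs, so 160 conditions per surface implies 4 recordings per pair. That factor of 4, and the 45-degree direction spacing, are assumptions for illustration; the page does not state either.

```python
from itertools import product

velocity_levels = range(5)                   # indices v0..v4 (values not given here)
directions_deg = [i * 45 for i in range(8)]  # assumed 45-degree spacing, 0..315
trials = range(4)                            # assumed: 160 / (5 * 8) = 4 per pair

conditions = list(product(velocity_levels, directions_deg, trials))
print(len(conditions))  # 160 conditions per surface
```

Across the full database this would yield 118 surfaces × 160 conditions = 18,880 multimodal recordings, which is what makes the per-surface velocity and direction analyses tractable.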
Michikuni Eguchi
Metaverse Lab, Cluster, Inc., 8-9-5 Nishigotanda, Shinagawa, Tokyo, Japan
Madoka Ito
Graduate School of Comprehensive Human Sciences, University of Tsukuba, 1-2 Kasuga, Tsukuba, Ibaraki, Japan
Tomohiro Hayase
Metaverse Lab, Cluster, Inc., 8-9-5 Nishigotanda, Shinagawa, Tokyo, Japan
Yuichi Hiroi
Metaverse Lab, Cluster, Inc.
Augmented Reality, Virtual Reality, Vision Augmentation
Takefumi Hiraki
University of Tsukuba / Cluster Metaverse Lab
Virtual Reality, Metaverse, Human-Computer Interaction, Haptics, Soft Robotics