🤖 AI Summary
Understanding the physical determinants of tactile comfort in garments remains challenging due to the lack of controlled, high-fidelity tactile data capturing dynamic finger–fabric interactions.
Method: We propose a robotic-arm-based tactile acquisition system that emulates fingertip sliding with precise control over velocity, direction, and normal force, while synchronously recording multimodal signals—tactile force, acceleration, and audio. Crucially, it enables non-destructive, full-garment tactile sensing with fine-grained motion annotations (e.g., speed, trajectory, contact state).
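The summary describes measurements recorded under controlled motion parameters with fine-grained annotations. A minimal sketch of what one such annotated, multimodal record might look like is below; the field names and values are illustrative assumptions, not the paper's actual database schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record for one stroking measurement. Field names are
# illustrative only -- the paper's actual schema is not specified here.
@dataclass
class TactileSample:
    garment_id: str
    material: str            # e.g. "cotton", "wool"
    # Motion annotations set by the robotic arm controller
    speed_mm_s: float        # sliding velocity of the simulated fingertip
    direction_deg: float     # stroke direction on the fabric surface
    normal_force_n: float    # commanded normal force
    contact: bool            # contact-state flag
    # Synchronously recorded multimodal signal streams
    force: List[float] = field(default_factory=list)
    acceleration: List[float] = field(default_factory=list)
    audio: List[float] = field(default_factory=list)

sample = TactileSample(
    garment_id="G001", material="cotton",
    speed_mm_s=50.0, direction_deg=0.0, normal_force_n=0.5, contact=True,
    force=[0.49, 0.51], acceleration=[0.01, -0.02], audio=[0.001, 0.003],
)
print(sample.material, sample.speed_mm_s)  # → cotton 50.0
```

Keeping the motion parameters alongside the raw streams in each record is what makes the database "motion-labeled": every signal window can be traced back to the exact velocity, direction, and force under which it was captured.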
Contribution/Results: We introduce the first motion-parameter-annotated garment tactile database. Experiments demonstrate that incorporating motion information improves material classification accuracy by 12.7%, validating its value for building more robust and scalable models of tactile perception. This work establishes a physically grounded, experimentally validated framework for linking fabric mechanics to human tactile comfort.
📝 Abstract
The tactile sensation of clothing is critical to wearer comfort. To reveal the physical properties that make clothing comfortable, systematic collection of tactile data during sliding motion is required. We propose a robotic-arm-based system for collecting tactile data from intact garments. The system performs stroking measurements with a simulated fingertip while precisely controlling speed and direction, enabling the creation of motion-labeled, multimodal tactile databases. A machine learning evaluation showed that including motion-related parameters improved identification accuracy for audio and acceleration data, demonstrating the efficacy of motion-related labels for characterizing the tactile sensation of clothing. This system provides a scalable, non-destructive method for capturing tactile data of clothing, contributing to future studies on fabric perception and reproduction.
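The abstract reports that including motion-related parameters improved identification accuracy. One common way to exploit such labels, sketched below under assumed feature choices, is to concatenate motion parameters with signal-derived features before classification; the summary statistics and parameter values here are illustrative, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature extraction: simple summary statistics of a signal.
def signal_features(signal: np.ndarray) -> np.ndarray:
    return np.array([signal.mean(), signal.std(),
                     np.abs(np.diff(signal)).mean()])

# Synthetic acceleration trace plus the motion parameters under which it
# was (hypothetically) recorded: speed mm/s, direction deg, normal force N.
accel = rng.normal(0.0, 0.1, size=256)
motion = np.array([50.0, 0.0, 0.5])

# Motion-augmented feature vector: signal statistics concatenated with the
# motion labels, one simple way to condition a classifier on motion.
features = np.concatenate([signal_features(accel), motion])
print(features.shape)  # → (6,)
```

The augmented vector lets a downstream classifier learn that, for example, the same fabric produces different vibration spectra at different sliding speeds, which is consistent with the reported accuracy gain from motion labels.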