🤖 AI Summary
Tactile gesture recognition suffers from poor discriminability among similar gestures because existing approaches rely on static pressure distributions alone and fail to capture how contact evolves over time. To address this, the authors propose an optical-flow-enhanced tactile gesture recognition method that explicitly models temporal motion during contact by integrating dense optical flow, computed directly from raw tactile image sequences, as an auxiliary dynamic feature. Crucially, the approach requires no architectural changes to standard classifiers: the extracted inter-frame displacement fields are simply fused with conventional tactile features before classification. Validated on a tactile gesture recognition task, a classifier trained on the augmented images improves classification accuracy by 9% over one trained on standard tactile images. Lightweight and plug-and-play, the method makes it substantially easier to distinguish gestures that produce similar tactile images but exhibit different contact dynamics.
📝 Abstract
Tactile gesture recognition systems play a crucial role in Human-Robot Interaction (HRI) by enabling intuitive communication between humans and robots. The literature mainly addresses this problem by applying machine learning techniques to classify sequences of tactile images encoding the pressure distribution generated when executing the gestures. However, some gestures can be hard to differentiate based on the information provided by tactile images alone. In this paper, we present a simple yet effective way to improve the accuracy of a gesture recognition classifier. Our approach focuses solely on processing the tactile images used as input by the classifier. In particular, we propose to explicitly highlight the dynamics of the contact in the tactile image by computing the dense optical flow. This additional information makes it easier to distinguish between gestures that produce similar tactile images but exhibit different contact dynamics. We validate the proposed approach in a tactile gesture recognition task, showing that a classifier trained on tactile images augmented with optical flow information achieved a 9% improvement in gesture classification accuracy compared to one trained on standard tactile images.