A Distributed Multi-Modal Sensing Approach for Human Activity Recognition in Real-Time Human-Robot Collaboration

📅 2025-10-01
🏛️ IEEE Robotics and Automation Letters
📈 Citations: 1
Influential: 0
🤖 AI Summary
This work addresses the challenge of accurately and reliably recognizing human hand activity intentions in real time during human-robot collaboration. To this end, the authors propose a distributed multimodal perception architecture that fuses data from a wearable inertial measurement unit (IMU)-based data glove and vision-based tactile sensors to enable high-precision, real-time identification of dynamic hand activities during physical human-robot interaction. The system employs a modular design combined with a real-time sequential classification algorithm, achieving consistently high recognition accuracy across offline evaluation, static real-time testing, and realistic collaborative scenarios. Experimental results validate the effectiveness and practicality of the proposed approach, demonstrating its potential to serve as a robust perceptual foundation for dynamic human-robot collaboration.

📝 Abstract
Human activity recognition (HAR) is fundamental in human-robot collaboration (HRC), enabling robots to respond to and dynamically adapt to human intentions. This paper introduces a HAR system combining a modular data glove equipped with Inertial Measurement Units and a vision-based tactile sensor to capture hand activities in contact with a robot. We tested our activity recognition approach under different conditions, including offline classification of segmented sequences, real-time classification under static conditions, and a realistic HRC scenario. The experimental results show a high accuracy for all the tasks, suggesting that multiple collaborative settings could benefit from this multi-modal approach.
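The abstract describes fusing windowed IMU-glove and tactile-sensor data for real-time activity classification, but the page includes no code. Below is a minimal illustrative sketch of that fusion idea: per-window statistics from both modalities are concatenated into one feature vector and fed to a classifier. The nearest-centroid classifier, the window length, and all sensor dimensions here are assumptions for illustration, not the authors' actual sequential algorithm.

```python
import numpy as np

def extract_window_features(imu, tactile):
    """Fuse one time window of IMU (T, 6) and tactile (T, K) data
    into a single feature vector of per-channel means and stds."""
    return np.concatenate([imu.mean(0), imu.std(0),
                           tactile.mean(0), tactile.std(0)])

class NearestCentroidHAR:
    """Toy stand-in for the paper's real-time sequential classifier:
    assigns each fused window to the nearest class centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(0) for c in self.classes_])
        return self
    def predict(self, X):
        # Euclidean distance from each sample to each centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return self.classes_[d.argmin(1)]

# Synthetic demo: two hand activities with different sensor statistics.
rng = np.random.default_rng(0)
def make_windows(label, bias, n=20, T=50):
    X = []
    for _ in range(n):
        imu = rng.normal(bias, 0.1, (T, 6))       # assumed 6-axis IMU
        tactile = rng.normal(bias, 0.1, (T, 16))  # assumed 16 taxels
        X.append(extract_window_features(imu, tactile))
    return np.stack(X), np.full(n, label)

Xa, ya = make_windows(0, 0.0)
Xb, yb = make_windows(1, 1.0)
clf = NearestCentroidHAR().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))
print(clf.predict(Xa[:3]))
```

In a real-time setting, the same feature extraction would run on a sliding window over the incoming glove and tactile streams, with each window classified as it arrives.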
Problem

Research questions and friction points this paper is trying to address.

Human Activity Recognition
Human-Robot Collaboration
Multi-Modal Sensing
Real-Time Recognition
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-Modal Sensing
Human Activity Recognition
Inertial Measurement Unit
Vision-Based Tactile Sensor
Human-Robot Collaboration
Valerio Belcamino
TheEngineRoom, Department of Informatics, Bioengineering, Robotics and Systems Engineering, University of Genoa, Genoa, Italy
Nhat Minh Dinh Le
Faculty of Mechanical Engineering, The University of Danang – University of Science and Technology, 54 Nguyen Luong Bang, 550000, Da Nang, Vietnam
Quan Khanh Luu
Soft Haptics Lab, School of Materials Science, Japan Advanced Institute of Science and Technology, Nomi 923-1292, Japan
Alessandro Carfì
University of Genoa
robotics, human-robot interaction, machine learning
Van Anh Ho
JAIST (Japan Advanced Institute of Science and Technology)
soft haptics, soft materials, robot grasping, adhesion
Fulvio Mastrogiovanni
University of Genoa, Istituto Italiano di Tecnologia
Cognitive Systems, Cognitive Robotics, Embodied Cognition, Embodied AI, Physical AI