Simultaneous Estimation of Manipulation Skill and Hand Grasp Force from Forearm Ultrasound Images

📅 2025-02-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of simultaneously decoding both hand manipulation skill categories and continuous grip force from single-source forearm ultrasound images, aiming to support high-fidelity teleoperation and skill transfer. Method: We propose the first unified modeling framework for joint skill classification and force estimation from the same ultrasound signal, departing from conventional single-task biosignal decoding paradigms. A hybrid CNN-LSTM architecture is employed, trained and evaluated via five-fold cross-validation on multi-subject data with fine-grained annotations. Contribution/Results: The method achieves 94.87% ± 10.16% accuracy in skill classification and a root-mean-square error (RMSE) of 0.51 ± 0.19 N in grip force estimation. It enables fine-grained, dual-objective decoding of motor intent from ultrasound—a non-invasive modality—thereby establishing a novel paradigm for non-invasive human–robot skill learning and transfer.
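The summary notes that the models were trained and evaluated with five-fold cross-validation on multi-subject data. As an illustrative sketch only (the paper's exact split protocol, e.g. any per-subject stratification, is not detailed here), five-fold index splitting can be written in plain Python; `five_fold_splits` is a hypothetical helper name, not from the paper:

```python
import random

def five_fold_splits(n_samples, seed=0):
    """Yield (train_idx, test_idx) pairs for 5-fold cross-validation.

    Hypothetical sketch: indices are shuffled once, then partitioned
    into five near-equal, disjoint test folds; the remaining indices
    form each fold's training set.
    """
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    k = 5
    # Distribute the remainder so fold sizes differ by at most one.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = idx[start:start + size]
        train = idx[:start] + idx[start + size:]
        yield train, test
        start += size
```

Each sample lands in exactly one test fold, so the five reported accuracy/RMSE values can be averaged into the mean ± standard deviation figures quoted above.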

📝 Abstract
Accurate estimation of human hand configuration and the forces it exerts is critical for effective teleoperation and skill transfer in robotic manipulation. A deeper understanding of human interactions with objects can further enhance teleoperation performance. To address this need, researchers have explored methods to capture and translate human manipulation skills and applied forces to robotic systems. Among these, biosignal-based approaches, particularly those using forearm ultrasound data, have shown significant potential for estimating hand movements and finger forces. In this study, we present a method for simultaneously estimating manipulation skills and applied hand force from forearm ultrasound data. Data collected from seven participants were used to train deep learning models for classifying manipulation skills and estimating grasp force. Our models achieved an average classification accuracy of 94.87% ± 10.16% for manipulation skills and an average root-mean-square error (RMSE) of 0.51 ± 0.19 N for force estimation, evaluated using five-fold cross-validation. These results highlight the effectiveness of forearm ultrasound in advancing human–machine interfacing and robotic teleoperation for complex manipulation tasks. This work opens new and effective possibilities for human–robot skill transfer and tele-manipulation, bridging the gap between human dexterity and robotic control.
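The abstract reports two metrics: classification accuracy for the skill labels and RMSE (in Newtons) for grasp force. A minimal sketch of how these are computed, with hypothetical function names (the paper does not publish this code):

```python
import math

def classification_accuracy(y_true, y_pred):
    """Fraction of correctly predicted skill labels."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def rmse(forces_true, forces_pred):
    """Root-mean-square error, in the same units as the inputs (Newtons here)."""
    squared_errors = [(t - p) ** 2 for t, p in zip(forces_true, forces_pred)]
    return math.sqrt(sum(squared_errors) / len(squared_errors))
```

Under five-fold cross-validation, each metric is computed once per held-out fold and the five values are summarized as mean ± standard deviation, which is how figures like 94.87% ± 10.16% and 0.51 ± 0.19 N would arise.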
Problem

Research questions and friction points this paper is trying to address.

Ultrasound Imaging
Motor Skill Assessment
Robotics Imitation Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Ultrasound Data
Deep Learning
Hand Gesture and Grip Force Recognition
Keshav Bimbraw
Robotics Engineering, Worcester Polytechnic Institute, Worcester, MA, USA
Srikar Nekkanti
Data Science, Worcester Polytechnic Institute, Worcester, MA, USA
Daniel B. Tiller II
Inova Medical Group, Alexandria, VA, USA
Mihir Deshmukh
Robotics Engineering, Worcester Polytechnic Institute, Worcester, MA, USA
Berk Çalli
Robotics Engineering, Worcester Polytechnic Institute, Worcester, MA, USA
Robert D. Howe
Harvard Paulson School of Engineering and Applied Sciences, Cambridge, MA, USA
Haichong K. Zhang
Worcester Polytechnic Institute
Medical Ultrasound, Robotics, Photoacoustics, Medical Imaging