Wrist2Finger: Sensing Fingertip Force for Force-Aware Hand Interaction with a Ring-Watch Wearable

📅 2025-10-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing hand pose tracking methods suffer from limitations in portability and force-sensing capability. This paper proposes a lightweight, cooperative ring-watch wearable system that fuses inertial measurements from a fingertip-worn IMU with single-channel surface electromyography (sEMG) signals from the wrist to jointly reconstruct 3D hand pose and estimate individual fingertip contact forces. The authors introduce a dual-branch transformer architecture with cross-modal attention and incorporate a biomechanics-inspired kinematic constraint loss to improve estimation accuracy and real-time performance. Evaluated on 20 subjects, the system achieves a mean joint position error of 0.57 cm and a fingertip force estimation RMSE of 0.213 (r = 0.76). It has been deployed in a real-time Unity-based virtual interaction system, demonstrating practical utility and robustness for natural, force-aware human-computer interaction.

📝 Abstract
Hand pose tracking is essential for advancing applications in human-computer interaction. Current approaches, such as vision-based systems and wearable devices, face limitations in portability, usability, and practicality. We present a novel wearable system that reconstructs 3D hand pose and estimates per-finger forces using a minimal ring-watch sensor setup. A ring worn on the finger integrates an inertial measurement unit (IMU) to capture finger motion, while a smartwatch-based single-channel electromyography (EMG) sensor on the wrist detects muscle activations. By leveraging the complementary strengths of motion sensing and muscle signals, our approach achieves accurate hand pose tracking and grip force estimation in a compact wearable form factor. We develop a dual-branch transformer network that fuses IMU and EMG data with cross-modal attention to predict finger joint positions and forces simultaneously. A custom loss function imposes kinematic constraints for smooth force variation and realistic force saturation. Evaluation with 20 participants performing daily object interaction gestures demonstrates an average Mean Per Joint Position Error (MPJPE) of 0.57 cm and fingertip force estimation with an RMSE of 0.213 (r = 0.76). We showcase our system in a real-time Unity application, enabling virtual hand interactions that respond to user-applied forces. This minimal, force-aware tracking system has broad implications for VR/AR, assistive prosthetics, and ergonomic monitoring.
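As a rough illustration of the cross-modal attention fusion described in the abstract, the sketch below lets each branch's features attend to the other's via scaled dot-product attention. The feature dimensions, sequence lengths, and concatenation fusion are assumptions for illustration; the paper's actual network is a learned dual-branch transformer.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, key_feats, d_k):
    # Scaled dot-product attention: one modality queries the other.
    scores = query_feats @ key_feats.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ key_feats

# Toy features: 10 time steps of 16-dim IMU and EMG embeddings (hypothetical sizes).
rng = np.random.default_rng(0)
imu = rng.standard_normal((10, 16))
emg = rng.standard_normal((10, 16))

# IMU branch attends to EMG and vice versa; fuse by concatenation.
fused = np.concatenate([cross_attention(imu, emg, 16),
                        cross_attention(emg, imu, 16)], axis=-1)
print(fused.shape)  # → (10, 32)
```

The fused features would then feed shared heads that regress joint positions and fingertip forces jointly.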
Problem

Research questions and friction points this paper is trying to address.

Estimating fingertip forces during hand interactions
Tracking 3D hand pose with minimal wearable sensors
Fusing IMU and EMG data for force-aware applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Ring-watch wearable with IMU and EMG sensors
Dual-branch transformer fuses motion and muscle data
Kinematic loss enables accurate force and pose tracking
Yingjing Xiao
East China Normal University, Shanghai, China
Zhichao Huang
East China Normal University, Shanghai, China
Junbin Ren
East China Normal University, Shanghai, China
Haichuan Song
East China Normal University, Shanghai, China
Yang Gao
East China Normal University, Shanghai, China
Yuting Bai
South China University of Technology, Guangzhou, China
Zhanpeng Jin
Xinshi Endowed Professor, South China University of Technology
Human-centered computing · ubiquitous computing · human-computer interaction · smart health