🤖 AI Summary
This study addresses the limited sensitivity of traditional post-stroke upper-limb functional assessments, which rely on ordinal scales or task completion time and often fail to capture subtle differences in movement quality. The authors propose a method that leverages standard monocular video—without requiring depth sensors or calibration objects—to estimate joint angles of the fingers, arm, and torso via computer vision and align them to a world coordinate system. By applying unsupervised dimensionality reduction to movement patterns during the Box and Block Test, the approach differentiates healthy controls from stroke survivors across 136 video recordings. Notably, it uncovers distinct abnormal postural patterns among patients with identical clinical scores, demonstrating a calibration-free, camera-based quantification of upper-limb movement quality that captures information beyond conventional timing and rating methods.
📝 Abstract
Standard clinical assessments of upper-extremity motor function after stroke either rely on ordinal scoring, which lacks sensitivity, or on time-based task metrics, which do not capture movement quality. In this work, we present a computer vision-based framework for analysis of upper-extremity movement during the Box and Block Test (BBT) through world-aligned joint angles of the fingers, arm, and trunk, without depth sensors or calibration objects. We apply this framework to a dataset of 136 BBT recordings collected from 48 healthy individuals and 7 individuals post-stroke. Using unsupervised dimensionality reduction of joint-angle features, we analyze movement patterns without relying on expert clinical labels. The resulting embeddings show separation between healthy movement patterns and stroke-related movement deviations. Importantly, some patients with the same BBT scores can be distinguished by different postural patterns. These results show that world-aligned joint angles can capture meaningful information about upper-extremity function beyond standard time-based BBT scores, with no effort from the clinician beyond recording monocular video of the patient with a phone or camera. This work highlights the potential of a camera-based, calibration-free framework to measure movement quality in clinical assessments without changing the widely adopted clinical routine.
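The unsupervised dimensionality-reduction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each BBT recording has been summarized as a fixed-length vector of joint-angle statistics, and uses plain PCA (via NumPy's SVD) as the embedding method; the feature layout and the choice of PCA are assumptions for illustration only.

```python
# Hypothetical sketch: embed per-recording joint-angle feature vectors
# into a low-dimensional space with PCA. The feature construction and
# the specific reduction method used in the paper are not specified here.
import numpy as np

def pca_embed(features: np.ndarray, n_components: int = 2) -> np.ndarray:
    """Project (n_recordings, n_features) joint-angle features to n_components dims."""
    centered = features - features.mean(axis=0)  # zero-mean each feature column
    # SVD of the centered matrix; rows of vt are the principal directions,
    # ordered by decreasing explained variance
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T  # coordinates in principal-component space

# Toy example: 10 recordings, each described by 12 joint-angle statistics
# (e.g., mean and range of 6 angles) -- purely synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 12))
emb = pca_embed(X)
print(emb.shape)  # (10, 2)
```

In a setup like this, separation between healthy and post-stroke recordings would then be inspected directly in the 2-D embedding, e.g., by plotting the points colored by group.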