🤖 AI Summary
To address three key challenges in sparse IMU-based motion capture—pose ambiguity, integration drift, and poor anthropometric adaptability—this paper proposes an online joint estimation method integrating six UWB ranging sensors with IMUs. We introduce a novel uncertainty-driven tightly coupled Unscented Kalman Filter (UKF) framework that jointly models sensor noise, human kinematic constraints, and subject-specific anthropometric priors, enabling real-time co-estimation of dynamic pose and body shape parameters. Evaluated on both synthetic and real-world datasets, our method significantly suppresses long-term drift and achieves higher orientation accuracy than state-of-the-art approaches. It demonstrates strong robustness to limb occlusion, vigorous motion, and inter-subject anatomical variability. This work is the first to embed UWB range measurements and anatomical priors into a tightly coupled filtering architecture, establishing a new high-precision, self-adaptive paradigm for wearable markerless motion capture.
📝 Abstract
Sparse wearable inertial measurement units (IMUs) have gained popularity for estimating 3D human motion. However, challenges such as pose ambiguity, data drift, and limited adaptability to diverse bodies persist. To address these issues, we propose UMotion, an uncertainty-driven, online fusing-all state estimation framework for 3D human shape and pose estimation, supported by six integrated, body-worn ultra-wideband (UWB) distance sensors with IMUs. UWB sensors measure inter-node distances to infer spatial relationships, aiding in resolving pose ambiguities and body shape variations when combined with anthropometric data. Unfortunately, IMUs are prone to drift, and UWB sensors are affected by body occlusions. Consequently, we develop a tightly coupled Unscented Kalman Filter (UKF) framework that fuses uncertainties from sensor data and estimated human motion based on individual body shape. The UKF iteratively refines IMU and UWB measurements by aligning them with uncertain human motion constraints in real time, producing optimal estimates for each. Experiments on both synthetic and real-world datasets demonstrate the effectiveness of UMotion in stabilizing sensor data and its improvement over the state of the art in pose accuracy.
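To make the abstract's core mechanism concrete: the paper's UKF folds UWB inter-node range measurements into an uncertain state estimate. Below is a minimal, self-contained toy sketch of one such step — not the paper's implementation — showing a single UKF update of a 2D point from one range measurement, using hand-rolled van der Merwe scaled sigma points. The 2D state, the anchor positions, and all noise values are hypothetical illustration choices; UMotion's actual state (full-body pose and shape) and measurement models are far richer.

```python
import numpy as np

def merwe_sigma_points(mean, cov, alpha=1.0, beta=2.0, kappa=1.0):
    """Scaled sigma points and weights (van der Merwe parameterization)."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)        # matrix square root of scaled covariance
    pts = np.vstack([mean, mean + L.T, mean - L.T])  # 2n+1 symmetric sigma points
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))       # mean weights
    Wc = Wm.copy()                                 # covariance weights
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha**2 + beta)
    return pts, Wm, Wc

def ukf_range_update(mean, cov, z, anchor, r_var):
    """Fold one range measurement z ~ ||x - anchor|| into the (mean, cov) estimate."""
    pts, Wm, Wc = merwe_sigma_points(mean, cov)
    Z = np.linalg.norm(pts - anchor, axis=1)       # nonlinear range model per sigma point
    z_hat = Wm @ Z                                 # predicted measurement
    dz = Z - z_hat
    dx = pts - mean
    S = dz @ (Wc * dz) + r_var                     # innovation variance (scalar measurement)
    Pxz = dx.T @ (Wc * dz)                         # state-measurement cross covariance
    K = Pxz / S                                    # Kalman gain
    return mean + K * (z - z_hat), cov - np.outer(K, K) * S
```

Iterating this update over several anchors (with a small process-noise inflation between rounds as a stand-in for a prediction step) trilaterates the point, loosely mirroring how repeated inter-node ranges constrain joint positions in the paper's tightly coupled filter.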