🤖 AI Summary
In lower-limb exoskeletons, mechanical mounting constraints force depth sensors into narrow field-of-view (FoV) configurations with large sensor motion, so their point clouds suffer severe motion-induced distortion and odometry drifts significantly. To address this, we propose an EKF-ICP collaborative fusion framework that tightly integrates narrow-FoV depth point clouds with the exoskeleton's proprioceptive measurements (joint encoders and an IMU). State estimation is performed via an extended Kalman filter (EKF), while a customized iterative closest point (ICP) algorithm makes point cloud registration robust under strong motion disturbances, enabling high-fidelity terrain elevation mapping. Experimental results demonstrate a 62% reduction in horizontal pose drift compared to a purely proprioceptive baseline; moreover, the generated elevation maps exhibit substantially improved completeness and geometric consistency over conventional point cloud mapping approaches.
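For intuition about the registration step, the sketch below shows a generic point-to-point ICP loop: nearest-neighbour correspondences followed by an SVD-based (Kabsch) rigid fit, iterated until the clouds align. This is an illustrative baseline only, not the authors' customized variant (which adds robustness to motion distortion); the point counts, iteration budget, and brute-force matching are assumptions for the example.

```python
import numpy as np

def best_fit_transform(A, B):
    """Rigid (R, t) minimising ||A @ R.T + t - B|| via the Kabsch/SVD method."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(source, target, iters=20):
    """Vanilla point-to-point ICP: match, fit, transform, repeat."""
    src = source.copy()
    for _ in range(iters):
        # brute-force nearest neighbours (a k-d tree would be used in practice)
        d = np.linalg.norm(src[:, None] - target[None, :], axis=2)
        matched = target[d.argmin(axis=1)]
        R, t = best_fit_transform(src, matched)
        src = src @ R.T + t
    # net rigid transform from the original source to its aligned pose
    return best_fit_transform(source, src)

# usage: recover a small known rotation + translation between two clouds
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
target = np.random.default_rng(0).uniform(-1, 1, (200, 3))
source = target @ Rz.T + np.array([0.05, -0.02, 0.0])
R_est, t_est = icp(source, target)
```

With small initial misalignment the nearest-neighbour matches are mostly correct, so the loop converges to the inverse of the applied transform; large misalignments or narrow-FoV partial overlap are exactly the regimes where a vanilla ICP like this fails and customization is needed.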
📝 Abstract
For leg exoskeletons to operate effectively in real-world environments, they must be able to perceive and understand the terrain around them. However, unlike other legged robots, exoskeletons face specific constraints on where depth sensors can be mounted due to the presence of a human user. These constraints lead to a limited field of view (FOV) and greater sensor motion, making odometry particularly challenging. To address this, we propose a novel odometry algorithm that integrates proprioceptive data from the exoskeleton with point clouds from a depth camera to produce accurate elevation maps despite these limitations. Our method builds on an extended Kalman filter (EKF) to fuse kinematic and inertial measurements, while incorporating a tailored iterative closest point (ICP) algorithm to register new point clouds with the elevation map. Experimental validation with a leg exoskeleton demonstrates that our approach reduces drift and enhances the quality of elevation maps compared to a purely proprioceptive baseline, while also outperforming a more traditional point-cloud-map-based variant.
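The fusion architecture described above can be sketched as a standard EKF whose predict step consumes proprioceptive odometry (leg kinematics + IMU) and whose update step consumes a pose estimate from ICP registration. This is a minimal planar illustration under assumed noise parameters and a simplified 3-DoF state, not the authors' full estimator.

```python
import numpy as np

class PoseEKF:
    """Toy planar EKF: state is [px, py, yaw]."""

    def __init__(self):
        self.x = np.zeros(3)           # [px, py, yaw]
        self.P = np.eye(3) * 1e-3      # state covariance

    def predict(self, v, omega, dt, q=1e-2):
        """Propagate with body-frame speed v and yaw rate omega,
        as would be derived from joint encoders and the IMU."""
        px, py, yaw = self.x
        self.x = np.array([px + v * np.cos(yaw) * dt,
                           py + v * np.sin(yaw) * dt,
                           yaw + omega * dt])
        # Jacobian of the unicycle motion model w.r.t. the state
        F = np.array([[1.0, 0.0, -v * np.sin(yaw) * dt],
                      [0.0, 1.0,  v * np.cos(yaw) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + np.eye(3) * q * dt

    def update(self, z_icp, r=1e-2):
        """Correct with a pose measurement from ICP registration of the
        latest depth point cloud against the elevation map."""
        H = np.eye(3)                          # ICP observes the full planar pose
        y = z_icp - self.x                     # innovation
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi  # wrap the yaw residual
        S = H @ self.P @ H.T + np.eye(3) * r
        K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P

# usage: one proprioceptive step, then one ICP correction
ekf = PoseEKF()
ekf.predict(v=0.5, omega=0.0, dt=0.1)
ekf.update(np.array([0.049, 0.001, 0.0]))
```

The division of labour mirrors the paper's design: proprioception keeps the estimate smooth at high rate between depth frames, while each ICP registration anchors the pose to the map and bounds the drift that pure integration would accumulate.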