🤖 AI Summary
Existing learning-based inertial odometry methods suffer significant performance degradation on quadrupedal robots due to their highly dynamic motions. To address this, we propose the first cross-platform single-IMU inertial odometry framework jointly designed for pedestrians and quadrupedal robots. Our method introduces three key innovations: (1) a rule-based expert selection module that classifies the motion platform and routes inputs to pedestrian- or quadruped-specific expert subnetworks; (2) a dual-stage attention mechanism that jointly models long-range temporal dependencies and inter-axis (triaxial) coupling; and (3) a displacement uncertainty estimation module enabling robust EKF fusion. Evaluated on public pedestrian datasets and a newly collected quadrupedal robot dataset, our approach reduces Absolute Trajectory Error (ATE) by 14.3% and Relative Trajectory Error (RTE) by 11.4% on pedestrian data, and by 52.8% and 41.3% respectively on quadruped data. These results demonstrate substantial improvements in cross-platform generalization and state estimation accuracy.
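The rule-based expert selection step can be illustrated with a minimal sketch. The paper does not specify the actual rule, so the function name, the choice of gyroscope variance as the feature, and the threshold value below are all hypothetical:

```python
import numpy as np

def select_expert(gyro_window, var_threshold=2.0):
    """Route an IMU window to a platform-specific expert network.

    Hypothetical rule (not from the paper): quadruped gaits induce much
    higher angular-rate variance than pedestrian motion, so a simple
    variance threshold on the gyroscope signal separates the two.
    """
    v = float(np.var(gyro_window))
    return "quadruped" if v > var_threshold else "pedestrian"
```

In a full pipeline the selected label would decide which expert subnetwork receives the IMU sequence for displacement prediction.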
📝 Abstract
Learning-based inertial odometry has achieved remarkable progress in pedestrian navigation. However, extending these methods to quadruped robots remains challenging due to their distinct and highly dynamic motion patterns. Models that perform well on pedestrian data often experience severe degradation when deployed on legged platforms. To tackle this challenge, we introduce X-IONet, a cross-platform inertial odometry framework that operates using only a single Inertial Measurement Unit (IMU). X-IONet incorporates a rule-based expert selection module to classify motion platforms and route IMU sequences to platform-specific expert networks. The displacement prediction network features a dual-stage attention architecture that jointly models long-range temporal dependencies and inter-axis correlations, enabling accurate motion representation. It outputs both displacement and associated uncertainty, which are then fused by an Extended Kalman Filter (EKF) for robust state estimation. Extensive experiments on public pedestrian datasets and a self-collected quadruped robot dataset demonstrate that X-IONet achieves state-of-the-art performance, reducing Absolute Trajectory Error (ATE) by 14.3% and Relative Trajectory Error (RTE) by 11.4% on pedestrian data, and by 52.8% and 41.3% on quadruped robot data. These results highlight the effectiveness of X-IONet in advancing accurate and robust inertial navigation across both human and legged robot platforms.
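The EKF fusion step, where the network's predicted uncertainty serves as the measurement covariance, can be sketched in a simplified linear form. The actual X-IONet filter state and measurement model are not specified in the abstract; the 2-D position state, identity measurement matrix, and variable names below are assumptions for illustration:

```python
import numpy as np

def ekf_update(x, P, z, R):
    """Kalman measurement update fusing a network-predicted displacement.

    x : (2,) state estimate (here: 2-D position -- a simplifying assumption)
    P : (2, 2) state covariance
    z : (2,) network-predicted displacement pseudo-measurement
    R : (2, 2) measurement covariance, taken from the network's
        predicted displacement uncertainty
    """
    H = np.eye(2)                      # identity measurement model (assumed)
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y                  # corrected state
    P_new = (np.eye(2) - K @ H) @ P    # corrected covariance
    return x_new, P_new
```

The key property this sketch demonstrates is that a larger predicted uncertainty `R` shrinks the Kalman gain, so unreliable network outputs correct the state less aggressively.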