🤖 AI Summary
Time-varying biases in low-cost IMUs severely degrade the robustness of visual-inertial odometry (VIO), in particular causing bias estimation drift and error accumulation under visual degeneracy. To address this, we propose the Inertial Prior Network (IPNet), a plug-and-play module that performs end-to-end, non-recurrent estimation of the mean bias directly from raw IMU data, without relying on historical states or motion models. To overcome the absence of ground-truth bias labels, we introduce an iterative mean-statistics modeling strategy that enables self-supervised, out-of-the-box deployment. Evaluated on three standard benchmarks, integrating IPNet reduces VIO's absolute trajectory error (ATE-RMSE) by 46% on average, significantly improving both localization accuracy and system robustness.
📝 Abstract
The bias of low-cost Inertial Measurement Units (IMUs) is a critical factor affecting the performance of Visual-Inertial Odometry (VIO). In particular, when visual tracking encounters errors, the optimized bias may deviate significantly from the true value, adversely impacting the system's stability and localization precision. In this paper, we propose a novel plug-and-play framework featuring the Inertial Prior Network (IPNet), designed to accurately estimate IMU bias. Recognizing the substantial impact of initial bias errors in low-cost inertial devices on system performance, our network estimates the mean bias directly from raw IMU data, eliminating the dependency on historical estimates inherent to traditional recursive predictions and effectively preventing error propagation. Furthermore, we introduce an iterative approach to compute the mean value of the bias for network training, addressing the lack of bias labels in many visual-inertial datasets. The framework is evaluated on two public datasets and one self-collected dataset. Extensive experiments demonstrate that our method significantly enhances both localization precision and robustness, improving the ATE-RMSE metric by 46% on average. The source code and video will be available at https://github.com/yiyscut/VIO-IPNet.git.
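The iterative mean-statistics labeling idea can be illustrated with a short sketch. This is a hypothetical reconstruction, not the paper's actual implementation: the function name `iterative_mean_bias`, the 6-vector layout `[gyro_bias, accel_bias]`, and the sample data are all assumptions. It shows the core mechanism of accumulating a running mean of per-sample bias estimates incrementally, so that a training label can be formed without storing an entire sequence or having ground-truth bias labels.

```python
import numpy as np

def iterative_mean_bias(bias_samples):
    """Running (iterative) mean of raw per-sample bias estimates.

    Hypothetical sketch of an iterative mean-statistics label: the
    mean is updated incrementally with each new sample, so the label
    for a long sequence never requires storing all samples at once.
    """
    mean = np.zeros(6)  # assumed layout: [gyro_bias (3), accel_bias (3)]
    for k, b in enumerate(bias_samples, start=1):
        # incremental mean update: m_k = m_{k-1} + (b_k - m_{k-1}) / k
        mean += (np.asarray(b, dtype=float) - mean) / k
    return mean

# Illustrative per-sample bias estimates (fabricated values)
samples = [[0.01, -0.02, 0.0, 0.1, 0.0, -0.1],
           [0.03,  0.00, 0.0, 0.1, 0.2, -0.1]]
label = iterative_mean_bias(samples)
```

The incremental update is algebraically identical to the batch mean, which makes it suitable as a streaming, self-supervised labeling step.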