AI Summary
This work addresses the challenge of accurately estimating process and measurement noise covariances in Kalman filtering when measurements are corrupted by outliers. The standard autocovariance least-squares (ALS) method is highly sensitive to such anomalies, leading to biased covariance estimates. To overcome this limitation, the authors propose a robust ALS-IRLS algorithm that integrates an innovation-based adaptive thresholding mechanism with a Huber cost function derived from an $\epsilon$-contamination model. This two-stage strategy first identifies and removes anomalous innovations via adaptive thresholding, then refines the covariance estimates through iteratively reweighted least squares. Experimental results demonstrate that the proposed method reduces the root-mean-square error (RMSE) of noise covariance estimation by more than two orders of magnitude compared to standard ALS, achieving state estimation accuracy approaching the ideal Oracle Cramér–Rao lower bound and significantly outperforming existing robust Kalman filtering approaches.
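The paper does not spell out its adaptive thresholding rule in this summary, but the first stage — flagging anomalous innovations before covariance estimation — can be illustrated with a common robust-statistics construction: an adaptive gate whose scale is set from the median absolute deviation (MAD) of the innovation sequence. The function name, the MAD-based scale, and the gate multiplier `k` are illustrative assumptions, not the authors' exact mechanism.

```python
import numpy as np

def flag_outlier_innovations(innovations, k=3.0):
    """Flag innovations whose magnitude exceeds an adaptive threshold.

    Illustrative sketch (not the paper's exact rule): the threshold is
    k times a robust scale estimate obtained from the MAD, so a few
    large outliers do not inflate the gate itself.
    """
    nu = np.asarray(innovations, dtype=float)
    med = np.median(nu)
    mad = np.median(np.abs(nu - med))
    sigma = 1.4826 * mad  # MAD -> standard deviation under Gaussianity
    return np.abs(nu - med) > k * sigma
```

Samples flagged `True` would be excluded from the autocovariance computation, so only the (mostly clean) remainder reaches the second, IRLS-based stage.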
Abstract
The autocovariance least squares (ALS) method is a computationally efficient approach for estimating noise covariances in Kalman filters without requiring specific noise models. However, conventional ALS and its variants rely on the classic least mean squares (LMS) criterion, making them highly sensitive to measurement outliers and prone to severe performance degradation. To overcome this limitation, this paper proposes a novel outlier-robust ALS algorithm, termed ALS-IRLS, based on the iteratively reweighted least squares (IRLS) framework. Specifically, the proposed approach introduces a two-tier robustification strategy. First, an innovation-level adaptive thresholding mechanism is employed to filter out heavily contaminated data. Second, the outlier-contaminated autocovariance is formulated using an $\epsilon$-contamination model, where the standard LMS criterion is replaced by the Huber cost function. The IRLS method is then utilized to iteratively adjust data weights based on estimation deviations, effectively mitigating the influence of residual outliers. Comparative simulations demonstrate that ALS-IRLS reduces the root-mean-square error (RMSE) of noise covariance estimates by over two orders of magnitude compared to standard ALS. Furthermore, it significantly enhances downstream state estimation accuracy, outperforming existing outlier-robust Kalman filters and achieving performance nearly equivalent to the ideal Oracle lower bound in the presence of noisy and anomalous data.
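The second stage — replacing the LMS criterion with a Huber cost solved by IRLS — can be sketched on a generic linear least-squares problem. This is a minimal illustration of the IRLS/Huber mechanics, not the authors' ALS formulation: the design matrix `A`, the Huber threshold `delta`, and the convergence tolerance are all assumptions for the example.

```python
import numpy as np

def huber_weights(residuals, delta):
    """Huber IRLS weights: 1 inside [-delta, delta], delta/|r| outside."""
    r = np.abs(residuals)
    return np.where(r <= delta, 1.0, delta / np.maximum(r, 1e-12))

def irls_huber(A, b, delta=1.345, n_iter=50, tol=1e-8):
    """Minimize the Huber cost of (A x - b) by iteratively reweighted LS.

    Each iteration solves the weighted normal equations
    (A^T W A) x = A^T W b, with W recomputed from the current residuals,
    so large residuals are progressively downweighted.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary LS initialization
    for _ in range(n_iter):
        r = A @ x - b
        w = huber_weights(r, delta)
        Aw = A * w[:, None]  # W A (row-wise scaling)
        x_new = np.linalg.solve(A.T @ Aw, Aw.T @ b)
        if np.linalg.norm(x_new - x) < tol * (1 + np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```

In the paper's setting the unknowns would be the stacked noise-covariance parameters and `A` the ALS regression matrix; the reweighting step is what mitigates any contaminated autocovariance samples that survive the first-stage thresholding.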