🤖 AI Summary
This work proposes a short-to-medium-term trajectory prediction method for drones using only event camera data, addressing the limitations of existing approaches that struggle to exploit propeller motion cues and often rely on RGB images or large training datasets. The key innovation lies in introducing propeller rotational speed (RPM) as a dynamic prior: RPM is estimated directly from raw event streams and integrated into an RPM-modulated Kalman filter, enabling high-precision prediction without learned models or conventional visual inputs. Evaluated on the FRED dataset, the proposed method achieves superior performance in both average displacement error and final displacement error at 0.4-second and 0.8-second prediction horizons, outperforming learning-based approaches and standard Kalman filtering, thereby demonstrating remarkable accuracy and robustness.
📝 Abstract
Event cameras provide high-temporal-resolution visual sensing that is well suited for observing fast-moving aerial objects; however, their use for drone trajectory prediction remains limited. This work introduces an event-only drone trajectory forecasting method that exploits propeller-induced motion cues. Propeller rotational speed (RPM) is extracted directly from raw event data and fused within an RPM-aware Kalman filtering framework. Evaluations on the FRED dataset show that the proposed method outperforms learning-based approaches and a vanilla Kalman filter in terms of average displacement error and final displacement error at 0.4 s and 0.8 s forecasting horizons. The results demonstrate robust and accurate short- and medium-horizon trajectory forecasting without reliance on RGB imagery or training data.
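To make the core idea concrete, below is a minimal sketch of what an RPM-modulated Kalman filter could look like: a standard 2D constant-velocity filter whose process noise is scaled by the estimated propeller RPM. The modulation rule (higher RPM implies more aggressive motion, hence larger process noise), the reference RPM `rpm_ref`, and all noise magnitudes are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def make_cv_matrices(dt):
    # State: [x, y, vx, vy]; constant-velocity transition model.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    # Only position is observed (e.g. from event-based detection).
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    return F, H

def kf_step(x, P, z, rpm, dt, base_q=1.0, r=0.5, rpm_ref=5000.0):
    """One predict/update cycle; process noise grows with RPM (assumed rule)."""
    F, H = make_cv_matrices(dt)
    q = base_q * (rpm / rpm_ref)  # hypothetical RPM modulation factor
    Q = q * np.diag([dt**4 / 4, dt**4 / 4, dt**2, dt**2])
    R = r * np.eye(2)
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the position measurement z.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Toy usage: track a target moving at constant velocity.
x, P = np.zeros(4), np.eye(4)
for t in range(1, 11):
    z = np.array([0.1 * t, 0.05 * t])  # synthetic position measurements
    x, P = kf_step(x, P, z, rpm=6000.0, dt=0.1)
```

In this sketch the RPM only rescales the process-noise covariance `Q`; the paper's method may instead use RPM to inform the motion model itself, which this toy does not attempt to reproduce.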