Runtime Anomaly Detection for Drones: An Integrated Rule-Mining and Unsupervised-Learning Approach

📅 2025-05-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing LSTM-based anomaly detection methods for multi-sensor fusion in UAV operations suffer from limited task generalizability, poor interpretability, and inadequate incorporation of domain knowledge, particularly under sensor faults that pose safety risks. To address these limitations, this paper proposes a real-time anomaly detection framework in which rule-driven and unsupervised-learning components cooperate. First, 44 phase-aware invariant rules are mined from physical constraints and mission logs, explicitly encoding domain expertise. Second, an ensemble of five unsupervised models, including Isolation Forest and an Autoencoder, is integrated with the rule-based module to enable joint data- and rule-driven decision-making. Evaluated on the ArduPilot/Gazebo simulation platform, the method achieves a 93.84% detection rate and a 2.33% false positive rate across six representative fault types, significantly outperforming LSTM baselines. The framework delivers high accuracy, strong interpretability via human-readable rules, and lightweight real-time deployability.
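The joint rule- and data-driven decision described in the summary can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the rule, the detector stand-ins, and the majority-vote threshold are all assumptions made for the example.

```python
# Hypothetical sketch of RADD-style joint decision-making: flag an anomaly
# if any mined invariant rule is violated, or if a majority of the five
# unsupervised detectors vote "anomalous".

def rule_violations(sample, rules):
    """Return the names of invariant rules that the sample violates."""
    return [name for name, check in rules.items() if not check(sample)]

def is_anomalous(sample, rules, detectors, vote_threshold=3):
    """Combine rule checks with a majority vote over unsupervised detectors."""
    violated = rule_violations(sample, rules)
    votes = sum(detector(sample) for detector in detectors)
    return bool(violated) or votes >= vote_threshold, violated

# Illustrative phase-aware invariant: during ascent, climb rate stays positive.
rules = {
    "ascent_climb_rate_positive":
        lambda s: s["phase"] != "ascent" or s["climb_rate"] > 0.0,
}

# Stand-ins for five trained unsupervised models (e.g. Isolation Forest,
# Autoencoder); each returns 1 for "anomalous", 0 for "normal".
detectors = [lambda s: 0, lambda s: 0, lambda s: 1, lambda s: 0, lambda s: 0]

sample = {"phase": "ascent", "climb_rate": -0.5}
flag, violated = is_anomalous(sample, rules, detectors)
```

Returning the names of violated rules is what makes this style of detector interpretable: an operator sees which expected relationship broke, not just an opaque anomaly score.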

📝 Abstract
UAVs, commonly referred to as drones, have witnessed a remarkable surge in popularity due to their versatile applications. These cyber-physical systems depend on multiple sensor inputs, such as cameras, GPS receivers, accelerometers, and gyroscopes, with faults potentially leading to physical instability and serious safety concerns. To mitigate such risks, anomaly detection has emerged as a crucial safeguarding mechanism, capable of identifying the physical manifestations of emerging issues and allowing operators to take preemptive action at runtime. Recent anomaly detection methods based on LSTM neural networks have shown promising results, but three challenges persist: the need for models that can generalise across the diverse mission profiles of drones; the need for interpretability, enabling operators to understand the nature of detected problems; and the need for capturing domain knowledge that is difficult to infer solely from log data. Motivated by these challenges, this paper introduces RADD, an integrated approach to anomaly detection in drones that combines rule mining and unsupervised learning. In particular, we leverage rules (or invariants) to capture expected relationships between sensors and actuators during missions, and utilise unsupervised learning techniques to cover more subtle relationships that the rules may have missed. We implement this approach using the ArduPilot drone software in the Gazebo simulator, utilising 44 rules derived across the main phases of drone missions, in conjunction with an ensemble of five unsupervised learning models. We find that our integrated approach successfully detects 93.84% of anomalies over six types of faults with a low false positive rate (2.33%), and can be deployed effectively at runtime. Furthermore, RADD outperforms a state-of-the-art LSTM-based method in detecting the different types of faults evaluated in our study.
Problem

Research questions and friction points this paper is trying to address.

Detecting runtime anomalies in drones using integrated rule-mining and unsupervised learning
Addressing challenges in generalizability, interpretability, and domain knowledge capture for drone anomaly detection
Improving anomaly detection accuracy and reducing false positives in drone operations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines rule mining with unsupervised learning for joint decision-making
Mines invariants that capture expected sensor–actuator relationships during missions
Integrates an ensemble of five unsupervised models to cover subtler anomalies the rules may miss