Dynamic Black-box Backdoor Attacks on IoT Sensory Data

📅 2025-11-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Deep learning models for IoT sensor-based identification systems (e.g., gait authentication and human activity recognition from IMU data) are vulnerable to stealthy backdoor attacks. Method: The paper proposes a dynamic trigger-generation mechanism that enables high-stealth, low-perturbation backdoor injection under strictly black-box conditions, requiring no access to model parameters or gradients. By combining data poisoning with adversarial-perturbation principles, it designs a time-series-adaptive dynamic trigger and builds an end-to-end attack framework. Contribution/Results: Evaluated across multiple public datasets and state-of-the-art models, the attack achieves a success rate above 95% while keeping the relative L2 norm of the input perturbation below 0.5%. It largely evades existing defenses, exposing a critical AI-security vulnerability in wearable devices and underscoring urgent risks in real-world edge-AI deployments.
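The paper's exact trigger construction is not reproduced here, but the general data-poisoning idea the summary describes can be sketched. The snippet below is a minimal, hypothetical illustration: a sample-adaptive trigger shaped by each IMU window's own statistics, rescaled to a relative L2 budget (0.5%, matching the figure quoted above), added to a small fraction of training windows whose labels are flipped to the attacker's target class. Function names, the sinusoidal pattern, and the poisoning rate are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def dynamic_trigger(x, budget=0.005, seed=0):
    """Hypothetical sample-adaptive trigger: a smooth pattern modulated by the
    window's per-channel scale, rescaled to a relative L2 budget (0.5%)."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, x.shape[0])[:, None]
    # Base pattern shaped by the sample's own statistics (sample-adaptive).
    pattern = np.sin(2 * np.pi * 5 * t) * x.std(axis=0, keepdims=True)
    pattern += 0.1 * rng.standard_normal(x.shape)
    # Rescale so the perturbation meets the relative L2 budget exactly.
    scale = budget * np.linalg.norm(x) / (np.linalg.norm(pattern) + 1e-12)
    return scale * pattern

def poison(X, y, target_label, rate=0.05, budget=0.005):
    """Poison a fraction of the training set: add triggers, flip labels."""
    X_p, y_p = X.copy(), y.copy()
    n = max(1, int(rate * len(X)))
    idx = np.random.default_rng(1).choice(len(X), size=n, replace=False)
    for i in idx:
        X_p[i] = X[i] + dynamic_trigger(X[i], budget, seed=int(i))
        y_p[i] = target_label
    return X_p, y_p, idx

# Toy IMU-like batch: 100 windows of 128 timesteps x 6 channels (acc + gyro).
X = np.random.default_rng(2).standard_normal((100, 128, 6))
y = np.zeros(100, dtype=int)
X_p, y_p, idx = poison(X, y, target_label=1)
rel = np.linalg.norm(X_p[idx[0]] - X[idx[0]]) / np.linalg.norm(X[idx[0]])
print(f"relative L2 perturbation: {rel:.4f}")  # stays within the 0.5% budget
```

Under a black-box threat model the attacker never queries gradients; stealth comes entirely from keeping the perturbation budget tiny and the poisoning rate low, which is why the relative-norm rescaling step matters.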

📝 Abstract
Sensor data-based recognition systems are widely used in various applications, such as gait-based authentication and human activity recognition (HAR). Modern wearable and smart devices feature various built-in Inertial Measurement Unit (IMU) sensors, and such sensor-based measurements can be fed to a machine learning-based model to train and classify human activities. While deep learning-based models have proven successful in classifying human activity and gestures, they pose various security risks. In our paper, we discuss a novel dynamic trigger-generation technique for performing black-box adversarial attacks on sensor data-based IoT systems. Our empirical analysis shows that the attack is successful on various datasets and classifier models with minimal perturbation on the input data. We also provide a detailed comparative analysis of performance and stealthiness to various other poisoning techniques found in backdoor attacks. We also discuss some adversarial defense mechanisms and their impact on the effectiveness of our trigger-generation technique.
Problem

Research questions and friction points this paper is trying to address.

Dynamic black-box backdoor attacks on IoT sensor recognition systems
Novel trigger-generation technique for adversarial attacks on sensor data
Evaluating attack effectiveness across datasets with minimal input perturbation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic trigger-generation for black-box adversarial attacks
Minimal perturbation on sensor input data
Successful across various datasets and classifier models