Adversarial Attacks on Event-Based Pedestrian Detectors: A Physical Approach

📅 2025-03-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work presents the first systematic study of physical-world adversarial attacks against event-camera-based pedestrian detectors, focusing on adversarial clothing textures. We propose an end-to-end physical attack framework that optimizes a 2D texture via backpropagation, models event-stream dynamics, and validates results jointly in the digital and physical domains, forming a closed-loop pipeline from digital texture generation to real-world garment fabrication. Experiments show that the generated adversarial textures significantly degrade state-of-the-art event-driven detectors in real scenes, reducing mAP by up to 76%. This constitutes the first empirical evidence of a critical physical-domain vulnerability in event-based vision models and establishes a benchmark for robust event-camera perception and security evaluation.

📝 Abstract
Event cameras, known for their low latency and high dynamic range, show great potential in pedestrian detection applications. However, while recent research has primarily focused on improving detection accuracy, the robustness of event-based visual models against physical adversarial attacks has received limited attention. For example, adversarial physical objects, such as specific clothing patterns or accessories, can exploit inherent vulnerabilities in these systems, leading to misdetections or misclassifications. This study is the first to explore physical adversarial attacks on event-driven pedestrian detectors, specifically investigating whether certain clothing patterns worn by pedestrians can cause these detectors to fail, effectively rendering them unable to detect the person. To address this, we developed an end-to-end adversarial framework in the digital domain, framing the design of adversarial clothing textures as a 2D texture optimization problem. By crafting an effective adversarial loss function, the framework iteratively generates optimal textures through backpropagation. Our results demonstrate that the textures identified in the digital domain possess strong adversarial properties. Furthermore, we translated these digitally optimized textures into physical clothing and tested them in real-world scenarios, successfully demonstrating that the designed textures significantly degrade the performance of event-based pedestrian detection models. This work highlights the vulnerability of such models to physical adversarial attacks.
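The core idea in the abstract, casting the adversarial clothing design as a 2D texture optimization driven by an adversarial loss and gradient descent, can be sketched in miniature. The toy below is an illustrative assumption, not the paper's actual pipeline: it stands in for the event-based detector with a single differentiable confidence score (`detector_confidence`) over a 4-element texture, and minimizes that confidence so the "pedestrian" goes undetected. All names and values here are hypothetical.

```python
import math

# Hypothetical stand-in for a differentiable detector: the confidence
# that a person is present, as a logistic function of the texture's
# response to a fixed feature filter. Illustrative values only.
feature_weights = [0.9, -0.4, 0.7, 0.2]

def detector_confidence(texture):
    logit = sum(w * t for w, t in zip(feature_weights, texture))
    return 1.0 / (1.0 + math.exp(-logit))  # sigmoid

def adversarial_loss(texture):
    # The attack wants the detector to miss the person, so the loss
    # to minimize is simply the detection confidence itself.
    return detector_confidence(texture)

def grad(texture):
    # Analytic gradient of sigmoid(w . t) w.r.t. t: s * (1 - s) * w.
    # In the paper's setting this gradient would come from
    # backpropagation through the detector; here it is closed-form.
    s = detector_confidence(texture)
    return [s * (1.0 - s) * w for w in feature_weights]

def optimize_texture(texture, steps=200, lr=0.5):
    # Iterative gradient descent on the adversarial loss, clamping each
    # texture element to [0, 1] as a crude stand-in for printability
    # constraints on physical fabric.
    for _ in range(steps):
        g = grad(texture)
        texture = [min(1.0, max(0.0, t - lr * gi))
                   for t, gi in zip(texture, g)]
    return texture

start = [0.5, 0.5, 0.5, 0.5]
adv = optimize_texture(start)
print(f"confidence before: {detector_confidence(start):.3f}, "
      f"after: {detector_confidence(adv):.3f}")
```

The optimizer pushes each texture element toward whichever bound lowers the detector's score, which mirrors, in one dimension per element, how the paper's framework iterates textures via backpropagation until the detector's output on the clothed pedestrian collapses.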
Problem

Research questions and friction points this paper is trying to address.

Explores physical adversarial attacks on event-based pedestrian detectors.
Investigates clothing patterns causing pedestrian detection failures.
Develops adversarial framework to degrade detection model performance.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Developed an end-to-end adversarial framework.
Optimized 2D textures for adversarial attacks.
Tested physical clothing in real-world scenarios.