PADetBench: Towards Benchmarking Physical Attacks against Object Detection

📅 2024-08-17
🏛️ arXiv.org
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Physical-domain adversarial attacks against object detection have long suffered from evaluation bottlenecks: time-consuming real-world experiments, irreproducibility, and domain misalignment, all of which hinder fair robustness comparison. To address this, we introduce the first standardized simulation benchmark for physical-domain detection attacks, pioneering an evaluation paradigm that jointly models controllable physical dynamics and cross-domain transformations. Our framework integrates 20 attack methods, 48 detectors, and an end-to-end analytical pipeline, leveraging a high-fidelity physics engine to model rendering, sensor characteristics, motion dynamics, and domain shifts, enabling fully automated adversarial image generation and attribution analysis. We conduct 8,064 systematic evaluations, identifying key factors governing attack efficacy and model robustness. The resulting protocol ensures reproducibility, comparability, and interpretability, establishing a unified benchmark and methodological foundation for advancing physically robust detection models.

๐Ÿ“ Abstract
Physical attacks against object detection have gained increasing attention due to their significant practical implications. However, conducting physical experiments is extremely time-consuming and labor-intensive. Moreover, physical dynamics and cross-domain transformations are challenging to strictly regulate in the real world, leading to unaligned evaluation and comparison, which severely hinders the development of physically robust models. To address these challenges, we explore utilizing realistic simulation to thoroughly and rigorously benchmark physical attacks fairly under controlled physical dynamics and cross-domain transformation. This resolves the problem of capturing identical adversarial images, which cannot be achieved in the real world. Our benchmark includes 20 physical attack methods, 48 object detectors, comprehensive physical dynamics, and evaluation metrics. We also provide end-to-end pipelines for dataset generation, detection, evaluation, and further analysis. In addition, we perform 8,064 groups of evaluations based on our benchmark, including both overall evaluation and detailed ablation studies under controlled physical dynamics. Through these experiments, we provide in-depth analyses of physical attack performance and physical adversarial robustness, draw valuable observations, and discuss potential directions for future research. Codebase: https://github.com/JiaweiLian/Benchmarking_Physical_Attack
Problem

Research questions and friction points this paper is trying to address.

Benchmarking physical attacks on object detection models
Addressing unaligned evaluation due to real-world physical dynamics
Providing controlled simulation for fair adversarial robustness assessment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Utilizes realistic simulation for benchmarking
Includes 20 attack methods and 48 detectors
Provides end-to-end pipelines for evaluation
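The cross-product of attack methods, detectors, and controlled physical-dynamics settings is what produces the benchmark's thousands of evaluation groups. The sketch below illustrates that grid structure in Python; all names (`evaluate`, the attack/detector labels, the dynamics settings) are hypothetical placeholders, not the actual PADetBench API or its real configuration.

```python
# Illustrative sketch of a benchmark evaluation grid: every (attack, detector,
# dynamics) combination forms one evaluation group. Names are hypothetical.
from itertools import product

attacks = [f"attack_{i}" for i in range(20)]      # 20 physical attack methods
detectors = [f"detector_{i}" for i in range(48)]  # 48 object detectors
dynamics_settings = ["clear", "rain", "fog"]       # illustrative dynamics only


def evaluate(attack: str, detector: str, dynamics: str) -> dict:
    """Placeholder: the real pipeline would render adversarial scenes in the
    simulator and report detection metrics (e.g. mAP) for this combination."""
    return {"attack": attack, "detector": detector,
            "dynamics": dynamics, "mAP": None}


results = [evaluate(a, d, dyn)
           for a, d, dyn in product(attacks, detectors, dynamics_settings)]
print(len(results))  # 20 * 48 * 3 = 2880 groups in this toy grid
```

Because simulation fixes the scene exactly across runs, each detector sees identical adversarial inputs, which is the alignment property the paper argues is unattainable in physical-world experiments.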