Revisiting Physically Realizable Adversarial Object Attack against LiDAR-based Detection: Clarifying Problem Formulation and Experimental Protocols

📅 2025-07-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing physical adversarial attacks against LiDAR-based 3D object detection suffer from low practicality, poor experimental reproducibility, and strong hardware dependency. To address these issues, this paper introduces the first device-agnostic, standardized assessment framework that is co-evaluated in both simulation and the real world. The method formally defines protocols for physical attack modeling, perturbation injection, and effectiveness evaluation; integrates both point-cloud- and mesh-level perturbation generation; and establishes a unified testing pipeline built on open-source benchmarks (e.g., OpenPCDet, SUSTechPOINTS). The authors empirically validate the real-world transferability of digital attacks on physical LiDAR systems, identifying critical factors, including viewing angle, distance, and surface material, that govern attack success. This work significantly enhances experimental consistency and result comparability, providing a reproducible, scalable, and generalizable infrastructure to advance research on adversarial robustness in 3D perception.
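To make the "perturbation injection" step concrete, the sketch below shows one common point-cloud-level formulation: shifting the points belonging to a target object by a bounded offset, so the result remains physically plausible. All names here (`inject_perturbation`, the `max_shift` bound of 5 cm) are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def inject_perturbation(scene_points: np.ndarray,
                        object_mask: np.ndarray,
                        delta: np.ndarray,
                        max_shift: float = 0.05) -> np.ndarray:
    """Apply a bounded xyz perturbation to one object's points.

    scene_points: (N, 3) LiDAR points of the full scene.
    object_mask:  (N,) boolean mask selecting the target object's points.
    delta:        (M, 3) per-point perturbation for the M masked points.
    max_shift:    plausibility bound in metres (hypothetical value).
    """
    # Clip the perturbation so displaced points stay near their originals.
    delta = np.clip(delta, -max_shift, max_shift)
    perturbed = scene_points.copy()
    perturbed[object_mask] += delta
    return perturbed

# Toy usage: a 5-point scene, perturbing the last 2 points; the 0.1 m
# offset is clipped down to the 0.05 m bound.
scene = np.zeros((5, 3))
mask = np.array([False, False, False, True, True])
adv = inject_perturbation(scene, mask, np.full((2, 3), 0.1))
```

Mesh-level attacks differ mainly in where the perturbation is applied (vertex positions before ray-casting rather than points after capture), but the same clipping idea enforces physical realizability in both cases.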

📝 Abstract
Adversarial robustness in LiDAR-based 3D object detection is a critical research area due to its widespread application in real-world scenarios. While many digital attacks manipulate point clouds or meshes, they often lack physical realizability, limiting their practical impact. Physical adversarial object attacks remain underexplored and suffer from poor reproducibility due to inconsistent setups and hardware differences. To address this, we propose a device-agnostic, standardized framework that abstracts key elements of physical adversarial object attacks, supports diverse methods, and provides open-source code with benchmarking protocols in simulation and real-world settings. Our framework enables fair comparison, accelerates research, and is validated by successfully transferring simulated attacks to a physical LiDAR system. Beyond the framework, we offer insights into factors influencing attack success and advance understanding of adversarial robustness in real-world LiDAR perception.
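The benchmarking protocol the abstract describes needs a comparable success measure across attacks and devices. A minimal sketch of one plausible metric follows; the function name, input shape, and 0.5 confidence threshold are assumptions for illustration, not the framework's actual interface.

```python
def attack_success_rate(detections_per_frame, threshold=0.5):
    """Fraction of attacked frames where the target object is missed.

    detections_per_frame: one list of post-attack confidence scores for
    the target object per frame (an empty list means no detection at all).
    An attack on a frame "succeeds" if no score reaches the threshold.
    """
    successes = sum(
        1 for scores in detections_per_frame
        if not any(s >= threshold for s in scores)
    )
    return successes / len(detections_per_frame)

# Toy usage: 4 attacked frames; the object survives detection in one.
asr = attack_success_rate([[], [0.9], [0.3], []])  # → 0.75
```

Fixing the metric and threshold across methods is what enables the fair comparison the abstract claims; otherwise each attack paper reports success under its own, incompatible criterion.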
Problem

Research questions and friction points this paper is trying to address.

Addressing lack of physical realizability in LiDAR adversarial attacks
Standardizing frameworks for reproducible physical adversarial object attacks
Enhancing understanding of real-world LiDAR adversarial robustness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Device-agnostic standardized adversarial attack framework
Open-source code with simulation and real-world benchmarking
Validated simulated-to-physical LiDAR attack transfer
Luo Cheng
University of Chinese Academy of Sciences, Beijing, China
Hanwei Zhang
Saarland University
Trustworthy Machine Learning · Interpretability · Adversarial Machine Learning · Deep Learning
Lijun Zhang
Key Laboratory of System Software (Chinese Academy of Sciences) and State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, Beijing, China
Holger Hermanns
Professor of Computer Science, Saarland University, Saarland Informatics Campus
Model Checking · Stochastic Modelling · Dependability · Energy Informatics