Physical ID-Transfer Attacks against Multi-Object Tracking via Adversarial Trajectory

📅 2025-12-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Multi-object tracking (MOT) exhibits security vulnerabilities during the data association stage, where attackers can exploit physically realizable adversarial trajectories (AdvTraj) to evade detection modules and illegitimately transfer their identity to target objects, causing ID confusion. This work proposes the first online, physically implementable ID manipulation attack that requires no modification to the detector; instead, it designs universal adversarial motion patterns to perturb mainstream association algorithms. Evaluated on the CARLA simulation platform, the attack achieves 100% success rate against SORT under both white-box and black-box settings, and attains up to 93% cross-model transferability against state-of-the-art MOT methods. Our findings expose a common vulnerability inherent in MOT association mechanisms, offering a novel perspective for robustness evaluation and establishing a benchmark adversarial paradigm for tracking systems.
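
To make the attacked stage concrete, here is a minimal sketch (not the paper's code) of the kind of nearest-neighbor data association used by SORT-style trackers: each track predicts a position, and detections are matched to the closest prediction. The positions and the gating threshold below are purely illustrative. When an attacker maneuvers so that its track's prediction lands closer to the target's detection than the target's own prediction does, the matcher hands the target's detection to the attacker's track ID, which is the ID-transfer effect AdvTraj exploits:

```python
def associate(predictions, detections, gate=2.0):
    """Greedy nearest-neighbor matching of track predictions to detections.

    predictions: dict of track_id -> predicted 1-D position
    detections:  list of observed 1-D positions
    Returns dict of track_id -> matched detection index (or None).
    """
    assignments = {}
    used = set()
    # Consider all (track, detection) pairs in order of increasing distance,
    # a simple stand-in for the Hungarian matching used in real trackers.
    pairs = sorted(
        (abs(pos - d), tid, j)
        for tid, pos in predictions.items()
        for j, d in enumerate(detections)
    )
    for dist, tid, j in pairs:
        if tid in assignments or j in used or dist > gate:
            continue
        assignments[tid] = j
        used.add(j)
    for tid in predictions:
        assignments.setdefault(tid, None)
    return assignments

# Benign frame: attacker (track 1) is far from the target (track 2),
# so each track is matched to its own detection.
benign = associate({1: 0.0, 2: 5.0}, [0.1, 5.1])

# Adversarial frame: the attacker has maneuvered so its predicted position
# (4.9) is closer to the target's true detection (5.0) than the target's
# own prediction (5.4) is. Track 1 steals detection 0, transferring its ID.
attacked = associate({1: 4.9, 2: 5.4}, [5.0, 6.0])
```

The sketch also shows why no detector attack is needed: both detections are perfectly correct in the adversarial frame; only the attacker's motion has changed.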

📝 Abstract
Multi-Object Tracking (MOT) is a critical task in computer vision, with applications ranging from surveillance systems to autonomous driving. However, threats to MOT algorithms have not yet been widely studied. In particular, incorrect association between the tracked objects and their assigned IDs can lead to severe consequences, such as wrong trajectory predictions. Previous attacks against MOT either focused on hijacking the trackers of individual objects, or manipulated the tracker IDs in MOT by attacking the integrated object detection (OD) module in the digital domain; such attacks are model-specific, non-robust, and only able to affect specific samples in offline datasets. In this paper, we present AdvTraj, the first online and physical ID-manipulation attack against tracking-by-detection MOT, in which an attacker uses adversarial trajectories to transfer its ID to a targeted object to confuse the tracking system, without attacking OD. Our simulation results in CARLA show that AdvTraj can fool ID assignments with a 100% success rate in various scenarios for white-box attacks against SORT, and that the attacks also have high transferability (up to 93% attack success rate) against state-of-the-art (SOTA) MOT algorithms due to their common design principles. We characterize the patterns of trajectories generated by AdvTraj and propose two universal adversarial maneuvers that can be performed by a human walker/driver in daily scenarios. Our work reveals under-explored weaknesses in the object association phase of SOTA MOT systems, and provides insights into enhancing the robustness of such systems.
Problem

Research questions and friction points this paper is trying to address.

Physical adversarial trajectories manipulate ID assignments in tracking-by-detection MOT systems.
The attack transfers the attacker's ID to a target object without attacking the object detection module.
Reveals vulnerabilities in object association phase of state-of-the-art MOT algorithms.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adversarial trajectory manipulation without attacking object detection
Online physical ID-transfer attacks against tracking-by-detection MOT
Universal adversarial maneuvers applicable in real-world scenarios
👥 Authors
Chenyi Wang (University of Arizona)
Yanmao Man (HERE Technologies)
Raymond Muller (Purdue University)
Ming Li (University of Arizona)
Z. Berkay Celik (Associate Professor of Computer Science, Purdue University; Security and Privacy, Systems Security, Cyber-Physical Systems Security)
Ryan Gerdes (Virginia Tech)
Jonathan Petit (Qualcomm; Computer Science, Vehicular Networks, Distributed Systems, Security, Privacy)