Are you a robot? Detecting Autonomous Vehicles from Behavior Analysis

📅 2024-03-14
🏛️ IEEE International Conference on Robotics and Automation
📈 Citations: 3
Influential: 0
📄 PDF
🤖 AI Summary
To address traffic regulation challenges during the transitional period of mixed human-driven and autonomous vehicle operation, this paper proposes a passive driver identification and behavioral profiling method that relies solely on onboard monocular video and vehicle motion-state data—without requiring active vehicle-side identification. We introduce an end-to-end vision–state fusion framework and pioneer a collaborative crowdsourced paradigm for driving behavior discrimination. We release NexusStreet, the first controllable simulation benchmark for this task. Our approach employs a lightweight CNN–LSTM architecture for multimodal temporal modeling, integrated with driving-behavior feature extraction and contrastive learning to enhance robustness under suboptimal sensing conditions. Experiments demonstrate an 80% identification accuracy using video alone, improving to 93% when fused with motion-state data; critically, the model retains strong discriminative capability even under sensor degradation.
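The summary above mentions driving-behavior feature extraction from motion-state data. As a rough illustration of the kind of hand-crafted motion cue such a pipeline could fuse with video embeddings (not the paper's actual features; `jerk_variance`, the sampling step `dt`, and the example traces are all assumptions for this sketch):

```python
# Illustrative sketch only (not the paper's implementation): a simple
# motion-state feature of the kind a behavior classifier might use.
# Autonomous controllers tend to produce smoother speed profiles than
# human drivers, so jerk statistics are one plausible discriminative cue.

def jerk_variance(speeds, dt=0.1):
    """Variance of jerk (rate of change of acceleration) over a speed trace."""
    accel = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    jerk = [(b - a) / dt for a, b in zip(accel, accel[1:])]
    mean = sum(jerk) / len(jerk)
    return sum((j - mean) ** 2 for j in jerk) / len(jerk)

# A smooth (AV-like) trace vs. a jittery (human-like) one, both accelerating.
smooth = [10 + 0.05 * t for t in range(50)]
jittery = [10 + 0.05 * t + (0.3 if t % 2 else -0.3) for t in range(50)]

assert jerk_variance(smooth) < jerk_variance(jittery)
```

In the framework described here, features like this would be one input alongside learned video representations; the fusion itself is done by the CNN–LSTM model rather than a fixed threshold.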

📝 Abstract
The tremendous hype around autonomous driving is eagerly calling for emerging and novel technologies to support advanced mobility use cases. As car manufacturers keep developing SAE level 3+ systems to improve the safety and comfort of passengers, traffic authorities need to establish new procedures to manage the transition from human-driven to fully-autonomous vehicles while providing a feedback-loop mechanism to fine-tune envisioned autonomous systems. Thus, a way to automatically profile autonomous vehicles and differentiate them from human-driven ones is a must. In this paper, we present a fully-fledged framework that monitors active vehicles using camera images and state information in order to determine whether vehicles are autonomous, without requiring any active notification from the vehicles themselves. Essentially, it builds on cooperation among vehicles, which share data acquired on the road to feed a machine learning model that identifies autonomous cars. We extensively tested our solution and created the NexusStreet dataset by means of the CARLA simulator, employing an autonomous driving control agent and a steering wheel maneuvered by licensed drivers. Experiments show it is possible to discriminate the two behaviors by analyzing video clips with an accuracy of ~80%, which improves up to ~93% when the target's state information is available. Lastly, we deliberately degraded the state information to observe how the framework performs under non-ideal data collection conditions.
Problem

Research questions and friction points this paper is trying to address.

Detect autonomous vehicles via behavior analysis without vehicle notifications
Differentiate autonomous from human-driven cars using camera and state data
Test framework accuracy under varying data collection conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses camera images and state data for vehicle monitoring
Employs machine learning on shared vehicle data for identification
Tested with simulated autonomous and human-driven scenarios
Fabio Maresca
Early Stage Researcher, NEC Laboratories Europe
Applied AI · Smart Cities · Autonomous Systems · Wireless Networks
Filippo Grazioli
NEC Laboratories Europe GmbH
Antonio Albanese
Flyhound Co.
Vincenzo Sciancalepore
NEC Laboratories Europe GmbH
Gianpiero Negri
Amazon Global Robotics - EU Innovation Lab
Xavier Pérez Costa
i2CAT Foundation, NEC Laboratories Europe GmbH and ICREA