VALISENS: A Validated Innovative Multi-Sensor System for Cooperative Automated Driving

📅 2025-05-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the insufficient robustness of automated vehicles in detecting vulnerable road users (VRUs) within complex urban environments—particularly under occlusion and limited perceptual range—this paper proposes a vehicle-infrastructure cooperative, distributed multi-agent perception system. The method introduces a novel cross-vehicle–infrastructure heterogeneous multimodal sensor fusion architecture, integrating onboard and roadside LiDAR, millimeter-wave radar, thermal imaging, and RGB cameras; thermal imaging specifically provides redundant VRU detection, while roadside sensing significantly extends the field of view and mitigates dynamic occlusions. Leveraging V2X communication, multi-source heterogeneous data fusion, joint object detection, and motion prediction, the system achieves a 23.6% improvement in VRU detection accuracy and a 41.2% enhancement in tracking continuity in real-world road tests. These results establish an engineering-ready, cooperative perception foundation for cooperative intelligent transportation systems (C-ITS).

Technology Category

Application Category

📝 Abstract
Perception is a core capability of automated vehicles and has been significantly advanced through modern sensor technologies and artificial intelligence. However, perception systems still face challenges in complex real-world scenarios. To improve robustness against various external factors, multi-sensor fusion techniques are essential, combining the strengths of different sensor modalities. With recent developments in Vehicle-to-Everything (V2X communication, sensor fusion can now extend beyond a single vehicle to a cooperative multi-agent system involving Connected Automated Vehicle (CAV) and intelligent infrastructure. This paper presents VALISENS, an innovative multi-sensor system distributed across multiple agents. It integrates onboard and roadside LiDARs, radars, thermal cameras, and RGB cameras to enhance situational awareness and support cooperative automated driving. The thermal camera adds critical redundancy for perceiving Vulnerable Road User (VRU), while fusion with roadside sensors mitigates visual occlusions and extends the perception range beyond the limits of individual vehicles. We introduce the corresponding perception module built on this sensor system, which includes object detection, tracking, motion forecasting, and high-level data fusion. The proposed system demonstrates the potential of cooperative perception in real-world test environments and lays the groundwork for future Cooperative Intelligent Transport Systems (C-ITS) applications.
Problem

Research questions and friction points this paper is trying to address.

Enhancing robustness in automated driving perception systems
Integrating multi-sensor fusion across vehicles and infrastructure
Improving detection of Vulnerable Road Users and occlusion handling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-sensor fusion across vehicles and infrastructure
Integration of LiDARs, radars, and thermal cameras
Cooperative perception for occlusion mitigation
Lei Wan
Lei Wan
Xiamen University
underwater acoustic communications
P
Prabesh Gupta
LiangDao GmbH, Rosa-Bavarese-Str. 3, 80639 Munich, Germany
A
Andreas Eich
LiangDao GmbH, Rosa-Bavarese-Str. 3, 80639 Munich, Germany
M
Marcel Kettelgerdes
Fraunhofer IVI, Institute for Transportation and Infrastructure Systems, Zeunerstr. 38, 01069 Dresden, Germany
Hannan Ejaz Keen
Hannan Ejaz Keen
Senior Researcher at Xitaso Gmbh.
RoboticsReinforcement LearningMappingNavigation
M
Michael Kloppel-Gersdorf
Fraunhofer IVI, Institute for Transportation and Infrastructure Systems, Zeunerstr. 38, 01069 Dresden, Germany
Alexey Vinel
Alexey Vinel
Karlsruhe Institute of Technology, Germany & Halmstad University, Sweden
vehicular networksnetworked systemsautonomous vehiclescyber-physical systems