π€ AI Summary
To address the lack of simulation support for event cameras in robotic navigation and manipulation, this paper introduces SEBVSβan open-source ROS toolkit that enables real-time, closed-loop simulation of synthetic event streams from RGB frames within Gazebo by integrating the v2e framework for the first time. This tool facilitates end-to-end training and evaluation of event-driven control policies, bridging a critical gap in mainstream robotic simulators regarding event-based vision modeling. Furthermore, we propose an event-driven visual servoing system combining Transformer architectures with behavior cloning. Experiments on object following and grasping tasks demonstrate significant improvements: 37% reduction in response latency and 22% higher success rate under occlusion, confirming enhanced real-time responsiveness and environmental robustness in dynamic scenes. The work advances the practical deployment of event cameras in real-world robotic control systems.
π Abstract
Event cameras offer microsecond latency, high dynamic range, and low power consumption, making them ideal for real-time robotic perception under challenging conditions such as motion blur, occlusion, and illumination changes. However, despite their advantages, synthetic event-based vision remains largely unexplored in mainstream robotics simulators. This lack of simulation setup hinders the evaluation of event-driven approaches for robotic manipulation and navigation tasks. This work presents an open-source, user-friendly v2e robotics operating system (ROS) package for Gazebo simulation that enables seamless event stream generation from RGB camera feeds. The package is used to investigate event-based robotic policies (ERP) for real-time navigation and manipulation. Two representative scenarios are evaluated: (1) object following with a mobile robot and (2) object detection and grasping with a robotic manipulator. Transformer-based ERPs are trained by behavior cloning and compared to RGB-based counterparts under various operating conditions. Experimental results show that event-guided policies consistently deliver competitive advantages. The results highlight the potential of event-driven perception to improve real-time robotic navigation and manipulation, providing a foundation for broader integration of event cameras into robotic policy learning. The GitHub repo for the dataset and code: https://eventbasedvision.github.io/SEBVS/