SEBVS: Synthetic Event-based Visual Servoing for Robot Navigation and Manipulation

πŸ“… 2025-08-25
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address the lack of simulation support for event cameras in robotic navigation and manipulation, this paper introduces SEBVS, an open-source ROS toolkit that integrates the v2e framework into Gazebo for the first time, enabling real-time, closed-loop simulation of synthetic event streams from RGB frames. This tool supports end-to-end training and evaluation of event-driven control policies, bridging a critical gap in mainstream robotic simulators, which lack event-based vision modeling. The authors further propose an event-driven visual servoing system that combines Transformer architectures with behavior cloning. Experiments on object-following and grasping tasks demonstrate significant improvements: a 37% reduction in response latency and a 22% higher success rate under occlusion, confirming enhanced real-time responsiveness and robustness in dynamic scenes. The work advances the practical deployment of event cameras in real-world robotic control systems.

πŸ“ Abstract
Event cameras offer microsecond latency, high dynamic range, and low power consumption, making them ideal for real-time robotic perception under challenging conditions such as motion blur, occlusion, and illumination changes. Despite these advantages, synthetic event-based vision remains largely unexplored in mainstream robotics simulators, and this gap in simulation support hinders the evaluation of event-driven approaches for robotic manipulation and navigation tasks. This work presents an open-source, user-friendly v2e Robot Operating System (ROS) package for Gazebo simulation that enables seamless event stream generation from RGB camera feeds. The package is used to investigate event-based robotic policies (ERPs) for real-time navigation and manipulation. Two representative scenarios are evaluated: (1) object following with a mobile robot and (2) object detection and grasping with a robotic manipulator. Transformer-based ERPs are trained by behavior cloning and compared to RGB-based counterparts under various operating conditions. Experimental results show that event-guided policies consistently deliver competitive advantages. The results highlight the potential of event-driven perception to improve real-time robotic navigation and manipulation, providing a foundation for broader integration of event cameras into robotic policy learning. Dataset and code: https://eventbasedvision.github.io/SEBVS/
Problem

Research questions and friction points this paper is trying to address.

Lack of event camera simulation in robotics for real-time perception
Evaluating event-driven approaches for robotic manipulation and navigation
Developing synthetic event generation from RGB feeds in simulators
Innovation

Methods, ideas, or system contributions that make the work stand out.

Open-source ROS package for Gazebo simulation
Generates event streams from RGB camera feeds
Transformer-based event-driven policies via behavior cloning
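The package's core idea, converting conventional RGB frames into event streams, follows the standard event camera model: a pixel fires an event when its log-intensity changes by more than a contrast threshold. Below is a minimal, illustrative sketch of that model only; the actual v2e pipeline used in SEBVS additionally models per-event timestamps via frame interpolation, pixel bandwidth, and sensor noise. The function name and threshold value are hypothetical, not taken from the paper's code.

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, threshold=0.2, eps=1e-3):
    """Toy event generation from two grayscale frames.

    Emits an (x, y, polarity) event wherever the log-intensity change
    between the frames exceeds the contrast threshold. Polarity is
    +1 (ON, brighter) or -1 (OFF, darker).
    """
    prev_log = np.log(prev_frame.astype(np.float64) + eps)
    curr_log = np.log(curr_frame.astype(np.float64) + eps)
    diff = curr_log - prev_log
    # Pixels whose log-intensity change crosses the threshold fire events.
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarity = np.sign(diff[ys, xs]).astype(np.int8)
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# Example: one pixel brightens sharply between frames -> one ON event.
prev = np.full((2, 2), 50, dtype=np.uint8)
curr = prev.copy()
curr[0, 1] = 200
print(frames_to_events(prev, curr))  # [(1, 0, 1)]
```

In a ROS setup such as the one the paper describes, a node would apply this conversion to consecutive images arriving on an RGB camera topic and republish the resulting event stream for downstream policies to consume.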
Krishna Vinod
Arizona State University, Tempe, AZ 85281, USA
Prithvi Jai Ramesh
Arizona State University, Tempe, AZ 85281, USA
Pavan Kumar B N
Indian Institute of Information Technology, Sri City, Chittoor, AP, India
Bharatesh Chakravarthi
School of Computing and AI, Arizona State University
Event-based Vision · ITS · Human Pose Estimation