Event-based vision for egomotion estimation using precise event timing

📅 2025-01-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper proposes a fully event-driven spiking neural network (SNN) pipeline for low-latency, low-power egomotion estimation in autonomous driving and robotics. Precise event timestamps are converted into bursts of spikes through a synaptic gating mechanism, and a shallow SNN decodes local optical flow and a global egomotion readout directly from the raw asynchronous event stream, eliminating frame-based intermediate representations entirely. Evaluation on a dedicated neuromorphic chip demonstrates strong potential for real-time, low-power on-chip deployment, while simulations of larger networks achieve state-of-the-art egomotion estimation accuracy among event-camera approaches.

📝 Abstract
Egomotion estimation is crucial for applications such as autonomous navigation and robotics, where accurate and real-time motion tracking is required. However, traditional methods relying on inertial sensors are highly sensitive to external conditions and suffer from drift, leading to large inaccuracies over long distances. Vision-based methods, particularly those utilising event-based vision sensors, provide an efficient alternative by capturing data only when changes are perceived in the scene. This approach minimises power consumption while delivering high-speed, low-latency feedback. In this work, we propose a fully event-based pipeline for egomotion estimation that processes the event stream directly within the event-based domain. This method eliminates the need for frame-based intermediaries, allowing for low-latency and energy-efficient motion estimation. We construct a shallow spiking neural network using a synaptic gating mechanism to convert precise event timing into bursts of spikes. These spikes encode local optical flow velocities, and the network provides an event-based readout of egomotion. We evaluate the network's performance on a dedicated chip, demonstrating strong potential for low-latency, low-power motion estimation. Additionally, simulations of larger networks show that the system achieves state-of-the-art accuracy in egomotion estimation tasks with event-based cameras, making it a promising solution for real-time, power-constrained robotics applications.
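As a rough illustration of the timing-to-burst encoding described in the abstract, the sketch below maps the inter-event interval between two neighbouring pixels to a spike-burst length: a fast-moving edge (short interval) yields a long burst, so spike count encodes local optical-flow speed. The function name and all constants (`pixel_pitch_um`, `gain`, `max_spikes`) are hypothetical illustrations, not taken from the paper.

```python
def interval_to_burst(t_prev, t_curr, pixel_pitch_um=15.0,
                      gain=0.5, max_spikes=10):
    """Map the inter-event interval between neighbouring pixels to a burst size.

    A short interval (fast edge) produces a long burst and a long interval a
    short one, so the spike count encodes local optical-flow speed.
    All constants here are illustrative, not from the paper.
    """
    dt = t_curr - t_prev                        # seconds between the two events
    if dt <= 0:
        return 0                                # ignore non-causal event pairs
    speed = pixel_pitch_um * 1e-6 / dt          # m/s across one pixel pitch
    return min(max_spikes, int(gain * speed * 1e3))  # clipped burst length
```

A downstream readout could then pool bursts from detectors tuned to different directions into a population estimate of global egomotion, which is the role the paper assigns to the shallow SNN.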
Problem

Research questions and friction points this paper is trying to address.

Autonomous Driving
Robotics
Self-motion Tracking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Event-Driven Method
Low-Power Consumption
Real-Time Self-Motion Estimation
Hugh Greatorex
Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Netherlands; Groningen Cognitive Systems and Materials Center (CogniGron), University of Groningen, Netherlands
Michele Mastella
Neuronova Ltd., Italy
Madison Cotteret
Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Netherlands; Groningen Cognitive Systems and Materials Center (CogniGron), University of Groningen, Netherlands; Micro- and Nanoelectronic Systems (MNES), Technische Universität Ilmenau, Germany
Ole Richter
Asynchronous VLSI and Architecture Group, School of Engineering & Applied Science (SEAS), Yale University, CT, USA
Elisabetta Chicca
Zernike Institute for Advanced Materials and CogniGron Center, University of Groningen
neuromorphic engineering