ES-Parkour: Advanced Robot Parkour with Bio-inspired Event Camera and Spiking Neural Network

📅 2025-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Quadrupedal robots face significant challenges in high-dynamic obstacle traversal in the wild, owing to the large latency and poor illumination robustness of conventional visual sensors (e.g., depth cameras), as well as the high computational cost and power consumption of deep neural networks. Method: This work pioneers the tight integration of bio-inspired event cameras with spiking neural networks (SNNs) to establish a low-latency, low-power, high-dynamic-range perception–decision closed-loop system. The approach synergistically combines event-driven sensing, SNN-based sparse computation, reinforcement learning–based control, and neuromorphic hardware architecture. Results: On a challenging parkour task in complex environments, the proposed system consumes only 11.7% of the energy of an equivalent artificial neural network (ANN) baseline, an 88.3% reduction in power usage. It markedly improves real-time performance and environmental adaptability, thereby overcoming critical deployment bottlenecks of traditional approaches in outdoor settings.
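As a rough illustration of where the reported 11.7% energy figure comes from, SNN papers commonly estimate energy by counting operations and multiplying by per-operation costs (e.g., the widely cited 45 nm figures of roughly 4.6 pJ per 32-bit multiply-accumulate versus 0.9 pJ per accumulate). The sketch below uses these illustrative numbers and a hypothetical spike rate; it is not the paper's actual accounting.

```python
# Illustrative SNN-vs-ANN energy estimate. Per-op energies are commonly
# cited 45 nm figures; operation counts and spike rate are assumptions.
E_MAC = 4.6e-12  # J per multiply-accumulate (dense ANN layer)
E_AC = 0.9e-12   # J per accumulate (SNN: a spike triggers only an add)

def ann_energy(n_mac):
    # every synapse performs a MAC each forward pass
    return n_mac * E_MAC

def snn_energy(n_syn_ops, spike_rate):
    # only synapses receiving a spike perform an accumulate
    return n_syn_ops * spike_rate * E_AC

# With 1e6 synaptic ops and a 20% spike rate, the SNN uses ~3.9% of
# the ANN's energy under these assumptions.
ratio = snn_energy(1e6, 0.2) / ann_energy(1e6)
```

The key lever is sparsity: with event-driven inputs, most synapses stay idle on most timesteps, so the effective cost scales with the spike rate rather than the layer size.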

📝 Abstract
In recent years, quadruped robotics has advanced significantly, particularly in perception and motion control via reinforcement learning, enabling complex motions in challenging environments. Visual sensors like depth cameras enhance stability and robustness but face limitations, such as low operating frequencies relative to joint control and sensitivity to lighting, which hinder outdoor deployment. Additionally, deep neural networks in sensor and control systems increase computational demands. To address these issues, we introduce spiking neural networks (SNNs) and event cameras to perform a challenging quadruped parkour task. Event cameras capture dynamic visual data, while SNNs efficiently process spike sequences, mimicking biological perception. Experimental results demonstrate that this approach significantly outperforms traditional models, achieving excellent parkour performance with just 11.7% of the energy consumption of an artificial neural network (ANN)-based model, yielding an 88.3% energy reduction. By integrating event cameras with SNNs, our work advances robotic reinforcement learning and opens new possibilities for applications in demanding environments.
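The abstract notes that SNNs process spike sequences efficiently, mimicking biological perception. A minimal sketch of this idea is a discrete-time leaky integrate-and-fire (LIF) neuron driven by a binary event train; the function name, leak factor, and threshold below are hypothetical choices for illustration, not the paper's architecture.

```python
import numpy as np

def lif_forward(events, tau=0.9, v_th=1.5):
    """Minimal leaky integrate-and-fire neuron (illustrative sketch).

    events : 1-D array of 0/1 inputs, e.g. one event-camera pixel over time
    tau    : membrane leak factor per timestep (assumed value)
    v_th   : firing threshold; membrane is reset to 0 after a spike
    """
    v = 0.0
    out = []
    for x in events:
        v = tau * v + x              # leaky integration of incoming events
        spike = 1 if v >= v_th else 0
        if spike:
            v = 0.0                  # hard reset after firing
        out.append(spike)
    return np.array(out)

# Two nearby input events are needed to cross the threshold, so the
# neuron fires only when events arrive close together in time.
spikes = lif_forward(np.array([1, 0, 1, 1, 0, 0, 1]))
```

Because the neuron only does work when an event arrives, computation is sparse and event-driven, which is the property the paper exploits for low-power perception.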
Problem

Research questions and friction points this paper is trying to address.

Overcome limitations of traditional visual sensors in robotics.
Reduce computational demands of deep neural networks in robots.
Enhance energy efficiency in robotic parkour tasks.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses event cameras for dynamic visual data capture
Implements spiking neural networks for efficient processing
Achieves significant energy reduction in robotic tasks
Qiang Zhang
Microelectronics Thrust, The Hong Kong University of Science and Technology (Guangzhou), Guangzhou, China
Jiahang Cao
The University of Hong Kong
Robot Learning · Generative Models · Cognitive-inspired Models
Jingkai Sun
Microelectronics Thrust, The Hong Kong University of Science and Technology (Guangzhou), Guangzhou, China
Gang Han
Professor of Biostatistics, Texas A&M University
Statistics · Biostatistics · Medical research · Computer experiments
Wen Zhao
JSPS International Fellow, UT-Austin Postdoc, KAUST
MEMS · Sensor · Nonlinear Dynamics
Yijie Guo
Beijing Innovation Center of Humanoid Robotics Co., Ltd.
Renjing Xu
HKUST(GZ)
Brain-inspired Computing · Humanoid Computing