Vision-Guided Loco-Manipulation with a Snake Robot

📅 2025-03-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of autonomous loco-manipulation for the COBRA snake robot in unstructured environments, this work introduces the first end-to-end vision-guided loco-manipulation system for this platform. Methodologically, it integrates YOLOv8 for real-time object detection with binocular stereo vision, proposes a lightweight 6-DOF pose estimation algorithm, and designs a closed-loop feedback control strategy based on kinematic modeling, enabling full 6-DOF closed-loop loco-manipulation coordination on a snake robot. Experiments in realistic settings demonstrate robust performance across object detection, 6-DOF pose estimation, grasping, transport, and millimeter-precision placement, achieving >15 FPS while significantly improving task robustness and real-time responsiveness. This work establishes a scalable, perception-planning-control integrated paradigm for the autonomous operation of soft and hyper-redundant robots.

📝 Abstract
This paper presents the development and integration of a vision-guided loco-manipulation pipeline for Northeastern University's snake robot, COBRA. The system leverages a YOLOv8-based object detection model and depth data from an onboard stereo camera to estimate the 6-DOF pose of target objects in real time. We introduce a framework for autonomous detection and control, enabling closed-loop loco-manipulation for transporting objects to specified goal locations. Additionally, we demonstrate open-loop experiments in which COBRA successfully performs real-time object detection and loco-manipulation tasks.
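The abstract's detection-to-pose step can be sketched in miniature: given a bounding box from a detector such as YOLOv8 and a depth value from the stereo camera, the object's 3-D position in the camera frame follows from the pinhole model. This is a minimal illustration, not the paper's algorithm; the bounding box, depth, and camera intrinsics below are made-up placeholder values.

```python
# Sketch: back-projecting a detected bounding-box center into the camera
# frame using the pinhole model and a stereo depth value.
# All numbers here (bbox, depth, intrinsics) are illustrative, not from
# the paper.

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at the given depth (meters) into a
    3-D point (x, y, z) in the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical detection: bounding box (x1, y1, x2, y2) from a detector
# such as YOLOv8, plus a depth read from the stereo disparity map.
bbox = (300, 220, 380, 300)
u = (bbox[0] + bbox[2]) / 2      # bbox center, pixels
v = (bbox[1] + bbox[3]) / 2
depth = 0.75                     # meters, from stereo matching

# Illustrative intrinsics for a 640x480 camera
point = pixel_to_camera(u, v, depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(point)
```

Full 6-DOF estimation additionally needs the object's orientation, which the paper recovers with its lightweight pose estimation algorithm; this sketch covers only the translational part.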
Problem

Research questions and friction points this paper is trying to address.

Develop vision-guided loco-manipulation for snake robot COBRA
Estimate 6-DOF object pose using YOLOv8 and stereo camera
Enable autonomous object transport to goal locations
Innovation

Methods, ideas, or system contributions that make the work stand out.

YOLOv8-based real-time object detection
Stereo camera for 6-DOF pose estimation
Closed-loop autonomous loco-manipulation control
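The closed-loop transport idea in the bullets above can be illustrated with a toy proportional controller: the estimated object position is compared against the goal each cycle and a corrective velocity is commanded until the error falls below a tolerance. This is a hedged sketch of the general feedback pattern, not COBRA's actual kinematics-based controller; the gain, tolerance, and simulated dynamics are invented for illustration.

```python
# Sketch of one closed-loop transport cycle: a proportional controller
# driving an estimated planar position toward a goal. Gain/tolerance are
# illustrative; the paper's controller is built on COBRA's kinematic model.

def transport_step(est_xy, goal_xy, gain=0.5, tol=0.01):
    """Return a (vx, vy) velocity command, or None once within tol of goal."""
    ex = goal_xy[0] - est_xy[0]
    ey = goal_xy[1] - est_xy[1]
    if (ex * ex + ey * ey) ** 0.5 < tol:
        return None  # goal reached
    return (gain * ex, gain * ey)

# Simulated loop: integrate the commanded velocity until convergence
pos, goal, dt = [0.0, 0.0], (1.0, 0.5), 0.1
while (cmd := transport_step(pos, goal)) is not None:
    pos[0] += cmd[0] * dt
    pos[1] += cmd[1] * dt
# pos is now within 1 cm of the goal
print(pos)
```

In the real system the "estimated position" would come from the vision pipeline each frame, closing the loop through perception rather than simulated integration.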
Adarsh Salagame
PhD Candidate, Northeastern University
Mobile Robotics · Space Robotics · UAV · Locomotion Control · Perception
Sasank Potluri
Department of Electrical and Computer Engineering, Northeastern University, Boston MA
Keshav Bharadwaj Vaidyanathan
Department of Electrical and Computer Engineering, Northeastern University, Boston MA
Kruthika Gangaraju
PhD Student, Worcester Polytechnic Institute
Robotics
Eric Sihite
California Institute of Technology, Pasadena CA
Milad Ramezani
Team Leader | Senior Research Scientist, CSIRO Data61
SLAM · Robotics · Machine Learning
Alireza Ramezani
Associate Professor, ECE, Northeastern University
Bioinspired Robotics · Legged Locomotion