AI Summary
This work addresses the challenge that single robots struggle to acquire, organize, and switch among diverse skills in real time within complex tasks and dynamic environments. Building upon the OpenClaw framework and the Unitree Go2 quadrupedal robot, the authors develop an embodied intelligence system featuring a customizable skill library, a task-driven skill scheduler, and a self-learning mechanism informed by human feedback. This architecture enables plug-and-play integration, autonomous validation, and continuous refinement of skills. The system interprets natural language instructions via the Feishu platform, providing intuitive interaction for non-expert users, and autonomously verifies and switches skills in real-world settings, significantly enhancing the robot's adaptability and usability.
Abstract
Adaptation to complex tasks and multiple scenarios remains a significant challenge for a single robot agent. The ability to acquire, organize, and switch between a wide range of skills in real time, particularly in dynamic environments, has become a fundamental requirement for embodied intelligence. We introduce OpenGo, an OpenClaw-powered embodied robotic dog capable of switching skills in real time according to the scene and task instructions. Specifically, the agent is equipped with (1) a customizable skill library with easy skill import and autonomous skill validation, (2) a dispatcher that selects and invokes different skills according to task prompts or language instructions, and (3) a self-learning framework that fine-tunes skills based on task completion and human feedback. We deploy the agent on Unitree's Go2 robotic dog and validate its ability to self-check and switch skills autonomously. In addition, by integrating Feishu-platform communication, we enable natural-language guidance and human feedback, allowing inexperienced users to control the robotic dog through simple instructions.
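To make the skill-library and dispatcher design concrete, the following is a minimal sketch of how plug-and-play skill registration with autonomous validation and instruction-driven skill selection could be organized. All names (`Skill`, `SkillLibrary`, keyword matching) are illustrative assumptions, not the paper's actual implementation, which runs on the Go2 robot and may use a learned language model for dispatch rather than keyword matching.

```python
# Hypothetical sketch: a skill library with plug-and-play registration,
# autonomous validation on import, and a dispatcher that selects a skill
# from a natural-language instruction by keyword overlap.

from dataclasses import dataclass
from typing import Callable, Dict, List, Optional


@dataclass
class Skill:
    name: str
    keywords: List[str]            # trigger words matched against instructions
    run: Callable[[], str]         # entry point; returns a status string
    validate: Callable[[], bool]   # autonomous self-check before activation


class SkillLibrary:
    def __init__(self) -> None:
        self._skills: Dict[str, Skill] = {}

    def register(self, skill: Skill) -> bool:
        """Plug-and-play import: a skill is admitted only if its self-check passes."""
        if skill.validate():
            self._skills[skill.name] = skill
            return True
        return False

    def dispatch(self, instruction: str) -> str:
        """Select the skill whose keywords best match the instruction and invoke it."""
        text = instruction.lower()
        best: Optional[Skill] = max(
            self._skills.values(),
            key=lambda s: sum(kw in text for kw in s.keywords),
            default=None,
        )
        if best is None or not any(kw in text for kw in best.keywords):
            return "no matching skill"
        return best.run()


# Usage: register two toy skills and dispatch a natural-language command.
lib = SkillLibrary()
lib.register(Skill("walk", ["walk", "forward"], lambda: "walking", lambda: True))
lib.register(Skill("sit", ["sit", "rest"], lambda: "sitting", lambda: True))
print(lib.dispatch("please walk forward slowly"))  # prints "walking"
```

In a real deployment the `validate` hook would exercise the skill on hardware or in simulation before admitting it, and the dispatcher would be driven by the language-instruction pipeline rather than simple keyword counts.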