RoboPanoptes: The All-seeing Robot with Whole-body Dexterity

📅 2025-01-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the poor adaptability and limited robustness of multi-task robotic manipulation in complex, constrained environments, this paper proposes a whole-body, vision-driven dexterous manipulation paradigm. It deploys a camera array distributed over the robot's body to establish a fault-tolerant perception architecture with multi-view feature fusion; designs an end-to-end whole-body visuomotor policy enabling coordinated multi-contact control and adaptive handling of spatial constraints; and incorporates imitation learning from human demonstrations to enhance generalization. Evaluated on three representative tasks—narrow-space box opening, sweeping of multiple or oversized objects, and multi-step stowing in cluttered scenes—the system significantly outperforms baseline methods, achieving 27–43% higher success rates while maintaining strong robustness against partial sensor failures. The core contribution is the first closed-loop, whole-body vision-guided dexterous manipulation framework, overcoming the limitations of conventional end-effector-centric paradigms.

📝 Abstract
We present RoboPanoptes, a capable yet practical robot system that achieves whole-body dexterity through whole-body vision. Its whole-body dexterity allows the robot to utilize its entire body surface for manipulation, such as leveraging multiple contact points or navigating constrained spaces. Meanwhile, whole-body vision uses a camera system distributed over the robot's surface to provide comprehensive, multi-perspective visual feedback of its own and the environment's state. At its core, RoboPanoptes uses a whole-body visuomotor policy that learns complex manipulation skills directly from human demonstrations, efficiently aggregating information from the distributed cameras while maintaining resilience to sensor failures. Together, these design aspects unlock new capabilities and tasks, allowing RoboPanoptes to unbox in narrow spaces, sweep multiple or oversized objects, and succeed in multi-step stowing in cluttered environments, outperforming baselines in adaptability and efficiency. Results are best viewed on https://robopanoptes.github.io.
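The abstract's key architectural idea is aggregating features from cameras distributed over the robot's surface while staying resilient to individual sensor failures. The paper's actual policy is learned end to end, but the failure-tolerant aggregation step can be illustrated with a minimal sketch: masked mean-pooling over per-camera feature vectors, where failed cameras are simply excluded from the average. The function name `fuse_views` and the list-based feature representation are illustrative assumptions, not the paper's implementation.

```python
def fuse_views(features, alive):
    """Masked mean-pooling over per-camera feature vectors.

    features: list of per-view embedding vectors, one per camera
    alive:    parallel list of booleans; False marks a failed camera

    A dropped sensor is excluded from the pool rather than feeding
    stale or zeroed values into the fused feature, so the policy's
    input degrades gracefully under partial sensor failure.
    """
    live = [f for f, ok in zip(features, alive) if ok]
    if not live:
        raise ValueError("all cameras failed")
    dim = len(live[0])
    return [sum(f[i] for f in live) / len(live) for i in range(dim)]


# Three cameras producing 2-D embeddings; the middle camera fails.
feats = [[1.0, 2.0], [3.0, 4.0], [11.0, 12.0]]
print(fuse_views(feats, [True, True, True]))   # pooled over all views
print(fuse_views(feats, [True, False, True]))  # pooled over survivors
```

In a learned system this fixed average would typically be replaced by an attention-weighted pooling over view tokens, but the masking principle is the same: the fusion operator must be well-defined for any non-empty subset of cameras.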
Problem

Research questions and friction points this paper is trying to address.

Adaptive Robotics
Multi-Task Execution
Perception and Mobility
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive Robotics
Multi-Task Learning
Resilient Design