🤖 AI Summary
Robust manipulation of diverse doors, which vary in type, opening direction, and mechanical properties, remains challenging in unstructured environments due to limited adaptability and generalization. Method: This paper proposes a tactile-driven hierarchical closed-loop control framework that integrates high-fidelity force/torque sensing, adaptive impedance control, semantic-guided high-level task planning, and low-level motion servoing, with online policy re-planning. Unlike conventional vision-centric or open-loop approaches, it uses tactile feedback as the primary modality for zero-shot generalization, enabling autonomous adaptation to unseen doors without prior modeling. Contribution/Results: Evaluated on 20 real-world doors spanning multiple buildings and morphologies, the framework achieves a 90% door-opening success rate, improving the generality and reliability of articulated object manipulation in open-world settings.
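The paper does not publish its controller, but the core idea of force-feedback adaptive impedance control can be illustrated with a hypothetical 1-D sketch: the arm tracks a desired handle position through a virtual spring-damper, and when the sensed contact force exceeds a limit, the stiffness is reduced so the arm yields to the door mechanism instead of fighting it. All names and parameter values below are invented for illustration.

```python
# Hypothetical 1-D adaptive impedance control sketch (not the paper's code).
# The controller tracks x_des through a virtual spring-damper; when the
# force/torque reading exceeds f_limit, stiffness K is softened.

def impedance_step(x, x_dot, x_des, f_ext, K, D, M=1.0, dt=0.01):
    """One semi-implicit Euler step of M*x_ddot = K*(x_des - x) - D*x_dot + f_ext."""
    x_ddot = (K * (x_des - x) - D * x_dot + f_ext) / M
    x_dot = x_dot + x_ddot * dt
    x = x + x_dot * dt
    return x, x_dot

def adapt_stiffness(K, f_ext, f_limit=20.0, gain=5.0, K_min=50.0):
    """Soften the virtual spring when sensed force exceeds f_limit (invented rule)."""
    if abs(f_ext) > f_limit:
        K = max(K_min, K - gain * (abs(f_ext) - f_limit))
    return K

# Toy "door": a stiff spring resisting handle displacement.
k_door = 800.0                      # N/m, door mechanism stiffness (assumed)
x, x_dot, x_des = 0.0, 0.0, 0.10    # handle position / velocity / target (m)
K, D = 300.0, 40.0                  # initial stiffness (N/m), damping (N*s/m)

for _ in range(3000):               # 30 s of simulated 100 Hz control
    f_ext = -k_door * x             # stands in for the force/torque sensor reading
    K = adapt_stiffness(K, f_ext)   # high-level adaptation from tactile feedback
    x, x_dot = impedance_step(x, x_dot, x_des, f_ext, K, D)

final_x, final_K = x, K
```

Because the toy door's steady-state resistance exceeds the force limit at the initial stiffness, the controller softens `K` until the contact force stays within bounds, settling at a compliant partial displacement; this mirrors how the paper's framework adapts its strategy online from force feedback rather than from a prior door model.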
📝 Abstract
Robots operating in unstructured environments face significant challenges when interacting with everyday objects such as doors, and they particularly struggle to generalize across diverse door types and conditions. Existing vision-based and open-loop planning methods often lack the robustness to handle varying door designs, mechanisms, and push/pull configurations. In this work, we propose a haptic-aware closed-loop hierarchical control framework that enables robots to explore and open unseen doors in the wild. Our approach leverages real-time haptic feedback, allowing the robot to adjust its strategy dynamically based on sensed forces during manipulation. We evaluate our system on 20 unseen doors across different buildings, spanning diverse appearances and mechanism types. Our framework achieves a 90% success rate, demonstrating its ability to generalize and robustly handle varied door-opening tasks. This scalable solution offers potential applications in broader open-world articulated object manipulation tasks.