🤖 AI Summary
Autonomous stair climbing and descending by humanoid robots in unknown environments remains challenging due to limited environmental perception, inaccurate terrain modeling, and insufficient real-time whole-body motion planning. Method: This paper proposes a perception-driven real-time motion planning framework, centered on a novel polygonal semantic plane mapping mechanism that fuses LiDAR, RGB-D, and IMU data with visual odometry and planar segmentation. Implemented on an NVIDIA Orin platform, the system achieves 20–30 Hz whole-body motion planning. Contribution/Results: The method enables efficient parsing of complex stair geometries and adaptive footstep placement, significantly improving environmental understanding accuracy and gait generation robustness. Extensive experiments across diverse indoor and outdoor staircases demonstrate stable, efficient autonomous stair traversal. The framework provides a scalable, integrated perception–planning solution for bipedal robot navigation in unstructured environments.
📝 Abstract
Recently, bipedal robot walking technology has developed significantly, mainly in the context of blind walking schemes. To emulate human walking, robots in unknown spaces need to step accurately on the positions they see. In this paper, we present PolyMap, a perception-based locomotion planning framework for humanoid robots to climb stairs. Our core idea is to build a real-time polygonal staircase semantic plane map, followed by a footstep planner that uses these polygonal plane segments. Plane segmentation and visual odometry are performed via multi-sensor fusion (LiDAR, RGB-D camera, and IMUs). The proposed framework is deployed on an NVIDIA Orin, which produces whole-body motion planning output at 20–30 Hz. Both indoor and outdoor real-scene experiments show that our method is efficient and robust for humanoid robot stair climbing.
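To make the core idea concrete, here is a minimal, hypothetical sketch of one step a footstep planner over polygonal plane segments must perform: testing whether a candidate footstep location lies inside a segmented stair-tread polygon. The polygon representation, coordinate frame, and function names below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: choose a stair plane for a candidate footstep.
# Assumes each detected plane is a simple polygon projected into a 2-D ground (x, y) frame.

def point_in_polygon(pt, poly):
    """Ray-casting test: is 2-D point pt inside the simple polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a horizontal ray extending from pt in the +x direction.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_plane(footstep, planes):
    """Return the index of the first plane polygon containing the footstep, or None."""
    for idx, poly in enumerate(planes):
        if point_in_polygon(footstep, poly):
            return idx
    return None

# Two stair treads as axis-aligned rectangles (illustrative values only).
treads = [
    [(0.0, 0.0), (0.3, 0.0), (0.3, 1.0), (0.0, 1.0)],
    [(0.3, 0.0), (0.6, 0.0), (0.6, 1.0), (0.3, 1.0)],
]
print(select_plane((0.45, 0.5), treads))  # → 1 (footstep lands on the second tread)
```

A real planner would additionally shrink each polygon by the foot's support-region margin and score candidate placements against kinematic reachability, but the containment query above is the basic geometric primitive the polygonal map enables.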