🤖 AI Summary
Existing control methods for humanoid robots struggle to simultaneously adapt to complex terrain and execute highly dynamic whole-body motions. This work proposes a unified reinforcement learning framework that tightly integrates environmental perception with whole-body dynamic motion control. For the first time, external perceptual inputs are embedded directly into multi-contact motion generation, combining full-body dynamics modeling with multi-contact motion planning. The resulting single policy enables agile maneuvers such as vaulting and rolling, and demonstrates markedly better robustness and generalization on unstructured terrain than conventional walking or running controllers.
📝 Abstract
Current approaches to humanoid control generally fall into two paradigms: perceptive locomotion, which handles terrain well but is limited to foot-based gaits, and general motion tracking, which reproduces complex skills but ignores the environment. This work unites these paradigms to achieve perceptive general motion control. We present a framework in which exteroceptive sensing is integrated into whole-body motion tracking, allowing a humanoid to perform highly dynamic, multi-contact skills beyond ordinary locomotion on uneven terrain. By training a single policy to perform multiple distinct motions across varied terrain features, we demonstrate the non-trivial benefit of integrating perception into the control loop. Our results show that this framework enables robust, highly dynamic multi-contact motions, such as vaulting and dive-rolling, on unstructured terrain, significantly expanding the robot's traversability beyond simple walking or running. Project page: https://project-instinct.github.io/deep-whole-body-parkour
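To make the "perception fused into motion tracking" idea concrete, the sketch below shows one plausible shape of such a policy: a local terrain heightmap (exteroception), proprioceptive state, and a reference-motion tracking target concatenated into a single observation and mapped by an actor network to joint-level actions. All dimensions, names, and the tiny tanh MLP are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical dimensions -- the abstract does not specify these;
# they are illustrative assumptions only.
HEIGHTMAP_DIM = 187   # flattened local terrain height samples (exteroception)
PROPRIO_DIM = 45      # joint positions/velocities, base angular velocity, etc.
REF_DIM = 30          # reference-motion tracking target (upcoming keyframe poses)
ACTION_DIM = 23       # joint position targets sent to the actuators

rng = np.random.default_rng(0)

def mlp_policy(obs, layer_sizes=(256, 128, ACTION_DIM)):
    """Toy feed-forward actor: a tanh MLP over the fused observation.

    Weights are drawn randomly here for illustration; a real policy
    would be trained with reinforcement learning.
    """
    x = obs
    for i, out_dim in enumerate(layer_sizes):
        w = rng.standard_normal((x.shape[-1], out_dim)) * 0.05
        x = x @ w
        if i < len(layer_sizes) - 1:  # hidden layers only
            x = np.tanh(x)
    return x

# Fuse exteroception, proprioception, and the motion-tracking goal into
# one observation vector, as a single unified policy would consume.
heightmap = rng.standard_normal(HEIGHTMAP_DIM)
proprio = rng.standard_normal(PROPRIO_DIM)
ref_motion = rng.standard_normal(REF_DIM)
obs = np.concatenate([heightmap, proprio, ref_motion])

action = mlp_policy(obs)  # one action vector per control step
```

The key design point this illustrates is that the terrain input enters the same network that tracks the reference motion, so contact-rich skills like vaulting can be modulated by perceived geometry rather than executed blindly.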