🤖 AI Summary
Humanoid robots deployed in real-world scenarios must simultaneously achieve high-fidelity motion imitation and robust environmental adaptability, yet existing approaches trade one for the other: optimization-based methods generalize well but suffer from low motion fidelity, while end-to-end imitation learning achieves high accuracy yet relies heavily on extensive motion datasets and precise reference trajectories. To bridge this gap, we propose AdaMimic, a novel framework that enables adaptive motion tracking from a single reference motion. Its core components are sparse keyframe generation, policy initialization by tracking the initial trajectory, and a lightweight trainable adapter for fine-grained temporal warping and low-level action refinement. Evaluated in simulation and on a physical Unitree G1 robot, AdaMimic significantly improves cross-environment imitation accuracy and robustness, achieving high-fidelity, environment-adaptive real-time motion control from minimal observational input.
📝 Abstract
Humanoid robots are envisioned to adapt demonstrated motions to diverse real-world conditions while accurately preserving motion patterns. Existing motion prior approaches achieve good adaptability from only a few motions but often sacrifice imitation accuracy, whereas motion-tracking methods imitate accurately yet require many training motions and a test-time target motion to adapt. To combine their strengths, we introduce AdaMimic, a novel motion-tracking algorithm that enables adaptable humanoid control from a single reference motion. To reduce data dependence while ensuring adaptability, our method first creates an augmented dataset by sparsifying the single reference motion into keyframes and applying light edits under minimal physical assumptions. A policy is then initialized by tracking these sparse keyframes to generate dense intermediate motions, and adapters are subsequently trained to adjust the tracking speed and refine low-level actions accordingly, enabling flexible time warping that further improves imitation accuracy and adaptability. We validate these significant improvements in both simulation and on a real-world Unitree G1 humanoid robot across multiple tasks and a wide range of adaptation conditions. Videos and code are available at https://taohuang13.github.io/adamimic.github.io/.
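The keyframe pipeline the abstract describes (sparsify a dense reference motion into keyframes, regenerate dense intermediates, and warp the timing) can be sketched as follows. This is an illustrative toy only: the function names, the uniform-stride keyframe selection, the linear interpolation, and the scalar `speed` warp are our assumptions, not the paper's actual implementation (which uses a learned policy and trained adapters).

```python
import numpy as np

def sparsify_to_keyframes(motion, stride=10):
    """Keep every `stride`-th frame (plus the final frame) as sparse keyframes.

    motion: (T, D) array of a dense reference trajectory.
    Returns (keyframe indices, keyframe poses).
    """
    idx = list(range(0, len(motion), stride))
    if idx[-1] != len(motion) - 1:
        idx.append(len(motion) - 1)
    idx = np.asarray(idx)
    return idx, motion[idx]

def densify(key_idx, keyframes, num_frames):
    """Regenerate a dense trajectory from sparse keyframes.

    Here we linearly interpolate per dimension; in AdaMimic this role is
    played by a policy trained to track the sparse keyframes.
    """
    t = np.arange(num_frames)
    return np.stack(
        [np.interp(t, key_idx, keyframes[:, d]) for d in range(keyframes.shape[1])],
        axis=1,
    )

def time_warp(key_idx, speed=1.0):
    """Rescale keyframe timestamps: speed > 1 plays the motion faster.

    A stand-in for the adapter's learned, per-step tracking-speed adjustment.
    """
    return key_idx / speed
```

For a linear toy trajectory, sparsifying and then densifying reconstructs the motion exactly, which is why only light editing of the keyframes is needed to adapt it to new conditions.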