🤖 AI Summary
Problem: Existing simplified models for legged robots predict whole-body motion poorly in complex environments, failing to capture the closed-loop dynamics of the robot and its controller or to support limb-level collision checking. Method: We propose a learning-based observation–prediction framework comprising (i) a neural observer with uniformly ultimately bounded (UUB) stability guarantees for latent state estimation from proprioceptive data; (ii) an efficient latent-dynamics predictor enabling large-scale parallel trajectory evaluation and stable initialization; and (iii) integration with a Model Predictive Path Integral (MPPI) sampling-based planner for limb-level obstacle avoidance and autonomous navigation. Contribution/Results: This work is the first to tightly integrate learning-based observation with real-time planning under theoretical stability guarantees. Experimental validation on the Vision 60 robot demonstrates successful narrow-corridor traversal and small-obstacle negotiation, significantly improving the robustness and real-time performance of collision-aware motion planning.
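The observer's role can be sketched in a few lines. This is a toy stand-in, not the paper's method: the learned network is replaced by a tanh layer with random placeholder weights (the names W, V, C, L and all dimensions are illustrative assumptions), showing the recurrent predict-then-correct interface and the practical meaning of a stable estimation-error bound, namely that estimates driven by the same measurements forget their initialization.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative placeholder weights for the learned observer network.
# Small magnitudes keep the update contractive for this demo.
dz, dy = 8, 6                         # latent / measurement dims (assumed)
W = rng.normal(0.0, 0.1, (dz, dz))    # recurrent weights (stand-in)
V = rng.normal(0.0, 0.1, (dz, dy))    # measurement input weights (stand-in)
C = rng.normal(0.0, 0.1, (dy, dz))    # latent-to-measurement map (stand-in)
L = rng.normal(0.0, 0.1, (dz, dy))    # innovation gain (stand-in)

def observe(z, y):
    """One observer step: a learned prediction followed by a
    Luenberger-style correction with the innovation y - C z_pred."""
    z_pred = np.tanh(W @ z + V @ y)
    return z_pred + L @ (y - C @ z_pred)

# Two different initial guesses, driven by the same proprioceptive
# stream, converge toward each other when the update is contractive.
za = rng.normal(size=dz)
zb = rng.normal(size=dz)
gap0 = np.linalg.norm(za - zb)
for _ in range(50):
    y = rng.normal(size=dy)           # simulated proprioceptive reading
    za, zb = observe(za, y), observe(zb, y)
gap = np.linalg.norm(za - zb)         # gap << gap0 after 50 steps
```

The UUB analysis in the paper concerns the true estimation error, not this toy contraction check, but the shrinking gap conveys why a provably stable estimate is a safe initial condition for the downstream predictor.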
📝 Abstract
Accurate full-body motion prediction is essential for the safe, autonomous navigation of legged robots, enabling critical capabilities like limb-level collision checking in cluttered environments. Simplified kinematic models often fail to capture the complex, closed-loop dynamics of the robot and its low-level controller, limiting their predictions to simple planar motion. To address this, we present a learning-based observer-predictor framework that accurately predicts the robot's full-body motion. Our method features a neural observer with provable uniformly ultimately bounded (UUB) guarantees that provides a reliable latent state estimate from a history of proprioceptive measurements. This stable estimate initializes a computationally efficient predictor, designed for the rapid, parallel evaluation of thousands of potential trajectories required by modern sampling-based planners. We validated the system by integrating our neural predictor into an MPPI-based planner on a Vision 60 quadruped. Hardware experiments successfully demonstrated effective, limb-aware motion planning in a challenging, narrow passage and over small objects, highlighting our system's ability to provide a robust foundation for high-performance, collision-aware planning on dynamic robotic platforms.
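To make the planner-side interface concrete, here is a minimal sketch of how a batched latent predictor plugs into an MPPI update. Everything here is an illustrative assumption, not the paper's implementation: a toy linear model (A, B) stands in for the learned latent dynamics, and the quadratic cost replaces the collision-aware objective; what carries over is the pattern of rolling out thousands of perturbed control sequences in parallel from one observer-supplied latent state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear latent model standing in for the learned predictor.
A = np.array([[1.0, 0.1], [0.0, 1.0]])   # latent transition (assumed)
B = np.array([[0.0], [0.1]])             # control influence (assumed)

def predict(z, u):
    """Batched one-step latent prediction: z (K, dz), u (K, du) -> (K, dz)."""
    return z @ A.T + u @ B.T

def cost(z, u):
    """Toy running cost: drive the latent state to the origin,
    lightly penalize control effort (stands in for collision cost)."""
    return np.sum(z ** 2, axis=-1) + 0.01 * np.sum(u ** 2, axis=-1)

def mppi(z0, U, K=1024, sigma=0.5, lam=1.0):
    """One MPPI update: sample K perturbed control sequences around the
    nominal plan U (H, du), roll them all out in parallel through the
    predictor, and return the exponentially weighted control update."""
    H, du = U.shape
    eps = rng.normal(0.0, sigma, size=(K, H, du))  # control perturbations
    z = np.tile(z0, (K, 1))                        # K parallel latent states
    total = np.zeros(K)
    for t in range(H):
        u = U[t] + eps[:, t]
        total += cost(z, u)
        z = predict(z, u)
    total += np.sum(z ** 2, axis=-1)               # terminal cost
    w = np.exp(-(total - total.min()) / lam)       # softmin trajectory weights
    w /= w.sum()
    return U + np.einsum("k,khd->hd", w, eps)      # weighted perturbation

z0 = np.array([1.0, 0.0])        # latent estimate from the observer (assumed)
U = np.zeros((20, 1))            # nominal control sequence, horizon H = 20
for _ in range(30):              # iterate the sampling-based update
    U = mppi(z0, U)
```

Because each MPPI iteration only needs batched forward passes of the predictor, the rollouts vectorize across all K samples, which is what makes evaluating thousands of candidate trajectories per control cycle feasible.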