🤖 AI Summary
To address safety-critical navigation and efficient human-robot collaboration in service robotics, this paper proposes a human behavior prediction framework integrating edge-aware perception with goal-directed intention inference. Methodologically, it unifies global edge sensing, multi-view human trajectory projection mapping, and goal-associated intention reasoning, while tightly coupling real-time path re-planning and collaborative control for millisecond-scale behavioral anticipation in both navigation and manipulation scenarios. Its key contribution is an end-to-end “perception–prediction–decision–execution” closed loop, overcoming the limitations of conventional single-modality trajectory extrapolation. Experiments in real-world environments demonstrate a 72% reduction in collision rate, a 38% decrease in task completion time for furniture co-placement, and successful end-to-end room layout reconstruction—significantly enhancing operational safety and task efficiency.
📝 Abstract
The anticipation of human behavior is a crucial capability for robots to interact with humans safely and efficiently. We employ a smart edge sensor network to provide global observations along with future predictions and goal information to integrate anticipatory behavior for the control of a mobile manipulation robot. We present approaches to anticipate human behavior in the context of safe navigation and a collaborative mobile manipulation task. First, we anticipate human motion by employing projections of human trajectories from smart edge sensor network observations into the planning map of a mobile robot. Second, we anticipate human intentions in a collaborative furniture-carrying task to achieve a given goal. Our experiments indicate that anticipating human behavior allows for safer navigation and more efficient collaboration. Finally, we showcase an integrated system that anticipates human behavior and collaborates with a human to achieve a target room layout, including the placement of tables and chairs.