Integrating Field of View in Human-Aware Collaborative Planning

📅 2025-05-20
📈 Citations: 1
Influential: 1
🤖 AI Summary
In human–robot collaboration (HRC), the limited human field of view (FOV) leads to incomplete perception, yet existing methods often assume full human observability. Method: We propose a probabilistic collaborative planning framework that explicitly models human FOV as a perceptual constraint. The approach employs a hierarchical online planner that efficiently generates robot trajectories, confined within the human's FOV, that actively elicit and align with human intent in large state spaces. Contribution/Results: This work is the first to integrate FOV as an explicit, geometrically grounded constraint into collaborative planning, jointly optimizing probabilistic intent inference and real-time trajectory generation. Evaluated on real-world cooking tasks and VR kitchen experiments, the framework significantly reduces human interruptions and redundant actions, demonstrating improved collaboration naturalness, efficiency, and robustness under partial observability.

📝 Abstract
In human-robot collaboration (HRC), it is crucial for robot agents to consider humans' knowledge of their surroundings. In reality, humans possess a narrow field of view (FOV), limiting their perception. However, research on HRC often overlooks this aspect and presumes an omniscient human collaborator. Our study addresses the challenge of adapting to the evolving subtask intent of humans while accounting for their limited FOV. We integrate FOV within the human-aware probabilistic planning framework. To handle the large state spaces that arise from modeling FOV, we propose a hierarchical online planner that efficiently finds approximate solutions while enabling the robot to explore low-level action trajectories that enter the human's FOV, influencing their intended subtask. Through a user study in our adapted cooking domain, we demonstrate that our FOV-aware planner reduces humans' interruptions and redundant actions during collaboration by adapting to human perception limitations. We extend these findings to a virtual reality kitchen environment, where we observe similar collaborative behaviors.
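The summary describes FOV as a geometrically grounded constraint on where robot trajectories may go. A minimal sketch of such a membership test, assuming a simple 2D view cone with an illustrative angular width and range (the paper does not specify these values or this implementation), could look like:

```python
import math

def in_fov(human_pos, human_heading, target_pos, fov_deg=120.0, max_range=5.0):
    """Return True if target_pos lies inside the human's assumed view cone.

    human_heading is the gaze direction in radians; fov_deg is the total
    angular width of the cone and max_range its depth (both assumed values).
    """
    dx = target_pos[0] - human_pos[0]
    dy = target_pos[1] - human_pos[1]
    if math.hypot(dx, dy) > max_range:
        return False
    angle_to_target = math.atan2(dy, dx)
    # Smallest signed angular difference between gaze and target bearing
    diff = (angle_to_target - human_heading + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= math.radians(fov_deg) / 2.0
```

A planner could use such a predicate to prune or reward candidate robot waypoints, keeping communicative motions visible to the human.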
Problem

Research questions and friction points this paper is trying to address.

Adapting to human subtask intent with limited field of view
Integrating FOV in human-aware probabilistic planning for HRC
Reducing interruptions by accounting for human perception constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates FOV in human-aware probabilistic planning
Proposes hierarchical online planner for large state spaces
Adapts to human perception limitations in collaboration
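The framework jointly performs probabilistic intent inference, tracking which subtask the human is pursuing. A minimal sketch of one Bayesian filtering step over candidate subtask intents (an illustrative formulation, not the authors' planner) might be:

```python
def update_intent_belief(belief, observed_action, likelihood):
    """One Bayesian update over candidate subtask intents.

    belief: dict mapping subtask -> prior probability.
    likelihood: function (action, subtask) -> P(action | subtask),
    an assumed observation model.
    """
    posterior = {s: belief[s] * likelihood(observed_action, s) for s in belief}
    total = sum(posterior.values())
    if total == 0:
        return dict(belief)  # uninformative observation; keep the prior
    return {s: p / total for s, p in posterior.items()}
```

Repeating this update as human actions are observed lets the robot adapt its plan to the evolving subtask intent described above.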
Ya-Chuan Hsu
Thomas Lord Department of Computer Science, University of Southern California, Los Angeles, CA 90089, USA
Michael Defranco
Thomas Lord Department of Computer Science, University of Southern California, Los Angeles, CA 90089, USA
Rutvik Patel
University of Southern California
Robotics, Robot Learning, Robotics Manufacturing, Field Robotics
Stefanos Nikolaidis
Associate Professor of Computer Science, University of Southern California
Robotics, Artificial Intelligence, Machine Learning