OA-NBV: Occlusion-Aware Next-Best-View Planning for Human-Centered Active Perception on Mobile Robots

📅 2026-03-10
🤖 AI Summary
This work addresses the challenge of incomplete visual perception of partially occluded individuals in cluttered environments, which significantly hinders human-centric tasks such as search and rescue. To overcome this limitation, the authors propose an occlusion-aware next-best-view (NBV) planning method that, for the first time, integrates a target-centered visibility model into NBV decision-making to explicitly optimize human body completeness in a single observation. The approach formulates a joint scoring mechanism incorporating target scale, occlusion status, and completeness to generate high-visibility candidate viewpoints while respecting robotic motion constraints, thereby closing the perception-planning loop. Experimental results demonstrate that the method achieves over 90% success rates in both simulated and real-world environments, yielding at least an 81% improvement in normalized target area and a minimum 58% increase in keypoint visibility compared to the strongest baseline.

📝 Abstract
When our view is blocked, we naturally step sideways or lean to see around the obstacle and recover a more informative observation. Enabling robots to make the same kind of viewpoint choice is critical for human-centered operations, including search, triage, and disaster response, where cluttered environments and partial visibility frequently degrade downstream perception. However, many Next-Best-View (NBV) methods primarily optimize generic exploration or long-horizon coverage, and do not explicitly target the immediate goal of obtaining a single usable observation of a partially occluded person under real motion constraints. We present Occlusion-Aware Next-Best-View Planning for Human-Centered Active Perception on Mobile Robots (OA-NBV), an occlusion-aware NBV pipeline that autonomously selects the next traversable viewpoint to obtain a more complete view of an occluded human. OA-NBV integrates perception and motion planning by scoring candidate viewpoints using a target-centric visibility model that accounts for occlusion, target scale, and target completeness, while restricting candidates to feasible robot poses. OA-NBV achieves over 90% success rate in both simulation and real-world trials, while baseline NBV methods degrade sharply under occlusion. Beyond success rate, OA-NBV improves observation quality: compared to the strongest baseline, it increases normalized target area by at least 81% and keypoint visibility by at least 58% across settings, making it a drop-in view-selection module for diverse human-centered downstream tasks.
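The joint scoring described in the abstract — combining occlusion, target scale, and completeness while restricting candidates to feasible robot poses — can be sketched as below. This is a minimal illustration of the idea, not the paper's implementation: the field names, the linear combination, and the weights are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    occlusion_ratio: float  # fraction of the target blocked (0 = fully visible)
    target_scale: float     # normalized projected size of the target in the image
    completeness: float     # fraction of body keypoints expected to be visible
    reachable: bool         # whether a feasible robot pose reaches this viewpoint

def score(c: Candidate, w_occ: float = 1.0, w_scale: float = 1.0,
          w_comp: float = 1.0) -> float:
    """Joint visibility score; higher is better. Weights are illustrative."""
    if not c.reachable:
        return float("-inf")  # infeasible poses are never selected
    return (w_occ * (1.0 - c.occlusion_ratio)
            + w_scale * c.target_scale
            + w_comp * c.completeness)

def next_best_view(candidates: list[Candidate]) -> Candidate:
    """Pick the candidate viewpoint with the highest joint score."""
    return max(candidates, key=score)
```

In this sketch an unreachable viewpoint scores negative infinity, so motion feasibility acts as a hard filter while the remaining three terms trade off smoothly, matching the paper's stated goal of maximizing human body completeness in a single feasible observation.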
Problem

Research questions and friction points this paper is trying to address.

Next-Best-View
occlusion
human-centered perception
mobile robots
active perception
Innovation

Methods, ideas, or system contributions that make the work stand out.

Next-Best-View Planning
Occlusion-Aware Perception
Human-Centered Robotics
Active Perception
Mobile Robot Navigation
Boxun Hu
Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
Chang Chang
Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD 21218, USA
Jiawei Ge
Johns Hopkins University
Robotics, Medical Robotics, Autonomous Robotic Surgery, Autonomous Tumor Resection
Man Namgung
Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD 21218, USA
Xiaomin Lin
Assistant Professor, University of South Florida
AI for Good, Robotics for Science, Robotics for Good
Axel Krieger
Associate Professor, Johns Hopkins University
Medical Devices, Robotics
Tinoosh Mohsenin
Johns Hopkins University
Energy-efficient computing for autonomous systems, machine learning and digital signal processing