Open Scene Graphs for Open-World Object-Goal Navigation

📅 2025-08-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses open-world semantic navigation—enabling robots to locate target objects in previously unseen environments via natural language instructions, without any task-specific training. We propose the Open Scene Graph (OSG) as a spatial memory representation that integrates generative spatial-semantic modeling with foundation model knowledge, enabling zero-shot generalization across environments, object categories, and robotic platforms. OSG organizes spatial-semantic relationships hierarchically via a schema-based structure, eliminating the need for fine-tuning or retraining on new scenes. Our method achieves state-of-the-art performance on the ObjectNav benchmark and is deployed end-to-end both in simulation and on real robots (Fetch and Spot). Experiments demonstrate significant improvements in robustness to unseen environment layouts, novel object categories, and heterogeneous robot morphologies. Results validate the effectiveness and scalability of our open-world semantic navigation system in both simulated and physical settings.

📝 Abstract
How can we build general-purpose robot systems for open-world semantic navigation, e.g., searching a novel environment for a target object specified in natural language? To tackle this challenge, we introduce OSG Navigator, a modular system composed of foundation models, for open-world Object-Goal Navigation (ObjectNav). Foundation models provide enormous semantic knowledge about the world, but struggle to organise and maintain spatial information effectively at scale. Key to OSG Navigator is the Open Scene Graph representation, which acts as spatial memory for OSG Navigator. It organises spatial information hierarchically using OSG schemas, which are templates, each describing the common structure of a class of environments. OSG schemas can be automatically generated from simple semantic labels of a given environment, e.g., "home" or "supermarket". They enable OSG Navigator to adapt zero-shot to new environment types. We conducted experiments using both Fetch and Spot robots in simulation and in the real world, showing that OSG Navigator achieves state-of-the-art performance on ObjectNav benchmarks and generalises zero-shot over diverse goals, environments, and robot embodiments.
Problem

Research questions and friction points this paper is trying to address.

Develop general-purpose robot systems for open-world semantic navigation
Organize spatial information effectively using Open Scene Graph representation
Achieve zero-shot adaptation to new environment types
Innovation

Methods, ideas, or system contributions that make the work stand out.

Open Scene Graph for hierarchical spatial memory
OSG schemas enable zero-shot environment adaptation
Modular system integrates foundation models effectively
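To make the hierarchical, schema-based memory idea concrete, here is a minimal sketch of an OSG-style structure. All class and variable names (`OSGNode`, `HOME_SCHEMA`, the layer types) are illustrative assumptions, not the authors' implementation; the paper's actual schemas are generated by foundation models from an environment label such as "home".

```python
from dataclasses import dataclass, field

@dataclass
class OSGNode:
    """One node in a hypothetical Open Scene Graph-style hierarchy."""
    label: str      # semantic label, e.g. "kitchen", "fridge"
    layer: str      # schema layer this node belongs to, e.g. "room"
    children: list["OSGNode"] = field(default_factory=list)

    def add(self, child: "OSGNode") -> "OSGNode":
        self.children.append(child)
        return child

    def find(self, label: str) -> "OSGNode | None":
        """Depth-first search for a node by semantic label."""
        if self.label == label:
            return self
        for child in self.children:
            hit = child.find(label)
            if hit is not None:
                return hit
        return None

# An assumed schema for a "home" environment: an ordered list of layer
# types describing the common structure of that environment class.
HOME_SCHEMA = ["building", "floor", "room", "object"]

# Build a tiny graph following the schema's layer ordering.
root = OSGNode("home", "building")
floor = root.add(OSGNode("floor_1", "floor"))
kitchen = floor.add(OSGNode("kitchen", "room"))
kitchen.add(OSGNode("fridge", "object"))
```

A navigator could query such a memory with `root.find("fridge")` to retrieve a goal node and then plan toward the rooms along its ancestry; the schema lets the same code serve a "supermarket" simply by swapping the layer list.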