Moving Through Clutter: Scaling Data Collection and Benchmarking for 3D Scene-Aware Humanoid Locomotion via Virtual Reality

📅 2026-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current humanoid robots lack whole-body coordination and balance control grounded in scene perception within densely cluttered, geometrically constrained 3D environments, and there is a notable absence of publicly available datasets that couple human motion with scene geometry. This work proposes the first open-source, virtual reality–based data collection and evaluation framework: leveraging procedurally generated 3D scenes with controllable clutter levels, immersive VR full-body motion capture, and automatic retargeting, it constructs a large-scale dataset comprising 145 scenes and 348 motion trajectories. For the first time, this project systematically couples physically plausible, embodied human motion with complex scene geometry and introduces quantitative benchmarks for environmental clutter and locomotion performance, such as stability and collision safety, thereby providing critical support for geometry-driven adaptive locomotion research in humanoid robotics.

📝 Abstract
Recent advances in humanoid locomotion have enabled dynamic behaviors such as dancing, martial arts, and parkour, yet these capabilities are predominantly demonstrated in open, flat, and obstacle-free settings. In contrast, real-world environments such as homes, offices, and public spaces are densely cluttered, three-dimensional, and geometrically constrained, requiring scene-aware whole-body coordination, precise balance control, and reasoning over spatial constraints imposed by furniture and household objects. However, humanoid locomotion in cluttered 3D environments remains underexplored, and no public dataset systematically couples full-body human locomotion with the scene geometry that shapes it. To address this gap, we present Moving Through Clutter (MTC), an open-source Virtual Reality (VR)-based data collection and evaluation framework for scene-aware humanoid locomotion in cluttered environments. Our system procedurally generates scenes with controllable clutter levels and captures embodiment-consistent, whole-body human motion through immersive VR navigation, which is then automatically retargeted to a humanoid robot model. We further introduce benchmarks that quantify environment clutter level and locomotion performance, including stability and collision safety. Using this framework, we compile a dataset of 348 trajectories across 145 diverse 3D cluttered scenes. The dataset provides a foundation for studying geometry-induced adaptation in humanoid locomotion and developing scene-aware planning and control methods.
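The abstract mentions benchmarks that quantify environment clutter level and collision safety but does not specify their form. As a rough illustration only, a minimal sketch of two plausible proxy metrics — floor-plan occupancy fraction for clutter, and minimum obstacle clearance along a trajectory for collision safety — might look as follows. The function names and metric definitions here are assumptions, not the paper's actual benchmarks.

```python
import numpy as np

def clutter_level(occupancy_grid: np.ndarray) -> float:
    """Fraction of floor-plan cells occupied by obstacles.

    `occupancy_grid` is a 2D boolean array (True = occupied).
    Hypothetical proxy; the paper's benchmark may weight 3D
    geometry (e.g., overhangs, passage widths) differently.
    """
    return float(occupancy_grid.mean())

def min_clearance(trajectory_xy: np.ndarray, obstacle_xy: np.ndarray) -> float:
    """Smallest distance from any trajectory waypoint to any obstacle
    point -- a simple collision-safety proxy (both arrays are Nx2)."""
    # Pairwise distances between waypoints and obstacle points.
    diffs = trajectory_xy[:, None, :] - obstacle_xy[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    return float(dists.min())

# Example: a 10x10 room with a 4x4 block of furniture.
grid = np.zeros((10, 10), dtype=bool)
grid[3:7, 3:7] = True
print(clutter_level(grid))  # 0.16

traj = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.5]])
obstacles = np.array([[2.0, 4.0], [5.0, 5.0]])
print(min_clearance(traj, obstacles))  # 1.5
```

Such scalar scores would make it straightforward to bin the 145 scenes by difficulty and compare locomotion policies under matched clutter conditions.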
Problem

Research questions and friction points this paper is trying to address.

humanoid locomotion
cluttered environments
scene-aware navigation
3D scene geometry
whole-body motion
Innovation

Methods, ideas, or system contributions that make the work stand out.

scene-aware locomotion
virtual reality data collection
humanoid robotics
cluttered 3D environments
motion retargeting
Beichen Wang
PhD Candidate at Wageningen University & Research
Natural Language Processing · Information Retrieval · Complex Network
Yuanjie Lu
George Mason University, 4400 University Dr, Fairfax, VA 22030, USA
Linji Wang
George Mason University, 4400 University Dr, Fairfax, VA 22030, USA
Liuchuan Yu
George Mason University, 4400 University Dr, Fairfax, VA 22030, USA
Xuesu Xiao
George Mason University, 4400 University Dr, Fairfax, VA 22030, USA