🤖 AI Summary
Functional vision assessment lacks ecologically valid, quantifiable paradigms for seated orientation and mobility (O&M). Method: We propose and validate a novel virtual reality–based seated O&M assessment protocol (VR-S-O&M), the first to integrate adaptive illumination control, multi-path obstacle interaction tasks, real-time motion tracking, and multidimensional behavioral logging—including response timing, miss patterns, and first-step latency—to enable dynamic, quantitative functional vision evaluation. Contribution/Results: Ecological validity was established in 42 normally sighted participants; we thereby constructed the first publicly available benchmark dataset of seated O&M behavior. Analyses revealed systematic associations between visual strategies and behavioral performance. This work establishes a standardized, VR-enabled framework for functional vision assessment, advancing evidence-based rehabilitation evaluation and assistive diagnostic tools for low-vision populations.
📝 Abstract
The purpose of this study was to develop and evaluate a novel virtual reality seated orientation and mobility (VR-S-O&M) test protocol for assessing functional vision. The study also provides a dataset of healthy subjects collected with this protocol, along with preliminary analyses. We introduced a VR-based O&M test protocol featuring a novel seated displacement method, diverse lighting conditions, and varying course configurations within a virtual environment. Normally sighted participants (N=42) completed the test, which required them to navigate a path and destroy identified obstacles. To verify ecological validity, we assessed basic performance metrics, including task duration, number of missed objects, and time before the first step, under different environmental conditions. Additionally, we analyzed participants' behavior around missed objects, demonstrating the potential of integrating behavioral and interactive data for more precise functional vision assessment. Our VR-S-O&M test protocol, together with the first O&M behavior dataset, offers significant opportunities for developing more refined performance metrics for functional vision assessment and for enhancing the quality of life of low-vision populations.