DriveSimQuest: A VR Driving Simulator and Research Platform on Meta Quest with Unity

📅 2025-08-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing VR driving simulators are largely limited to eye-tracking, rely on bulky external hardware, and are built primarily on Unreal Engine, hindering interactive human–vehicle research. This paper introduces a lightweight, multimodal driving simulation platform based on the Meta Quest Pro and Unity—enabling, for the first time on consumer-grade VR headsets, synchronized real-time acquisition of gaze, facial expressions, hand gestures, and full-body pose. By deeply integrating Quest Pro’s inside-out tracking, eye-tracking, and facial expression SDKs—and embedding real-time motion capture and behavioral analysis modules—the platform substantially lowers technical barriers. It supports driver affective state recognition and context-aware assistance system design. The platform has been successfully deployed in multiple human–vehicle interaction and driving behavior studies, demonstrating high deployability, experimental reproducibility, and scalability for multimodal behavioral research.

📝 Abstract
Using head-mounted Virtual Reality (VR) displays to simulate driving is critical to studying driving behavior and designing driver assistance systems. But existing VR driving simulators are often limited to tracking only eye movements. The bulky outside-in tracking setup and Unreal-based architecture also present significant engineering challenges for interaction researchers and practitioners. We present DriveSimQuest, a VR driving simulator and research platform built on the Meta Quest Pro and Unity, capable of capturing rich behavioral signals such as gaze, facial expressions, hand activities, and full-body gestures in real-time. DriveSimQuest offers a preliminary, easy-to-deploy platform that supports researchers and practitioners in studying drivers' affective states and behaviors, and in designing future context-aware driving assistance systems.
Problem

Research questions and friction points this paper is trying to address.

Existing VR driving simulators are largely limited to tracking eye movements
Bulky outside-in tracking setups and Unreal-based architectures hinder interaction research
Driving behavior research needs real-time capture of diverse driver signals
Innovation

Methods, ideas, or system contributions that make the work stand out.

Builds on the Meta Quest Pro for lightweight, inside-out-tracked VR driving simulation
Captures gaze, facial expressions, hand activities, and full-body gestures in real time
Built with Unity for easy deployment and reproducibility