🤖 AI Summary
This study addresses the longstanding trade-off between ecological validity and experimental control in driving simulation research. Conventional high-fidelity physical simulators are costly and inflexible, whereas virtual reality systems often lack authentic vehicle interaction. To bridge this gap, the authors propose and implement an open-source mixed-reality driving simulator that integrates a real automotive cockpit with a programmable virtual driving environment. This hybrid design preserves the physical fidelity of driver–vehicle interaction while enabling tightly controlled experimental manipulations. The platform incorporates eye tracking, touch-interaction logging, and high-fidelity scene rendering. Its research utility is demonstrated through a pilot study in an automated driving scenario, supporting analyses of driver attention allocation and human–machine interaction.
📝 Abstract
Designing and evaluating in-vehicle interfaces requires experimental platforms that combine ecological validity with experimental control. Driving simulators are widely used for this purpose. However, they face a fundamental trade-off: high-fidelity physical simulators are costly and difficult to adapt, while virtual reality simulators provide flexibility at the expense of physical interaction with the vehicle. In this work, we present MRDrive, an open mixed-reality driving simulator designed to support HCI research on in-vehicle interaction, attention, and explainability in manual and automated driving contexts. MRDrive enables drivers and passengers to interact with a real vehicle cabin while being fully immersed in a virtual driving environment. We demonstrate the capabilities of MRDrive through a small pilot study that illustrates how the simulator can be used to collect and analyze eye-tracking and touch-interaction data in an automated driving scenario. MRDrive is available at: https://github.com/ciao-group/mrdrive