Immersive virtual games: winners for deep cognitive assessment

📅 2025-02-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional cognitive assessments rely on highly controlled, brief tasks, limiting their ability to capture intra- and inter-individual variability. To address this, we introduce PixelDOPA—a Unity-based suite of immersive 3D microgames designed to systematically measure processing speed, rule shifting, inhibitory control, and working memory, while synchronously recording behavioral responses, eye movements, and motion trajectories. Our approach innovates by incorporating process-oriented metrics (e.g., oculomotor response timing) and unsupervised trajectory modeling to uncover stable, strategy-specific behavioral patterns. In a clinical sample of 60 participants, PixelDOPA demonstrated strong construct validity and test–retest reliability, comparable to the NIH Toolbox (Pearson’s *r* = 0.50–0.93; ICC = 0.50–0.92), yet with superior ecological validity. Furthermore, behavioral trajectories enabled robust cross-session participant identification, significantly enhancing sensitivity to individual differences and improving overall data quality.
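The test–retest figures above (ICC = 0.50–0.92) are intraclass correlations. As a minimal illustration of how such a figure is obtained (not the paper's code; `icc2_1` is a hypothetical helper), ICC(2,1), two-way random effects, absolute agreement, single measures, can be computed from an n-participant × k-session score table via the standard ANOVA decomposition:

```python
def icc2_1(scores):
    """ICC(2,1) for a list of per-participant score rows, one column per session."""
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    # Partition total sum of squares into subjects, sessions, and residual.
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((scores[i][j] - grand) ** 2
                   for i in range(n) for j in range(k))
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Values near 1 indicate that participants keep their rank ordering (and absolute scores) across sessions, which is the sense in which the minigames are called reliable.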

📝 Abstract
Studies of human cognition often rely on brief, highly controlled tasks that emphasize group-level effects but poorly capture the rich variability within and between individuals. Here, we present PixelDOPA, a suite of minigames designed to overcome these limitations by embedding classic cognitive task paradigms in an immersive 3D virtual environment with continuous behavior logging. Four minigames explore overlapping constructs such as processing speed, rule shifting, inhibitory control and working memory, comparing against established NIH Toolbox tasks. Across a clinical sample of 60 participants collected outside a controlled laboratory setting, we found significant, large correlations (r = 0.50–0.93) between the PixelDOPA tasks and NIH Toolbox counterparts, despite differences in stimuli and task structures. Process-informed metrics (e.g., gaze-based response times derived from continuous logging) substantially improved both task convergence and data quality. Test–retest analyses revealed high reliability (ICC = 0.50–0.92) for all minigames. Beyond endpoint metrics, movement and gaze trajectories revealed stable, idiosyncratic profiles of gameplay strategy, with unsupervised clustering distinguishing subjects by their navigational and viewing behaviors. These trajectory-based features showed lower within-person variability than between-person variability, facilitating player identification across repeated sessions. Game-based tasks can therefore retain the psychometric rigor of standard cognitive assessments while providing new insights into dynamic behaviors. By leveraging a highly engaging, fully customizable game engine, we show that comprehensive behavioral tracking boosts the power to detect individual differences, offering a path toward cognitive measures that are both robust and ecologically valid, even in less-than-ideal settings for data collection.
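The cross-session player identification described above can be sketched as a nearest-neighbor match on per-participant trajectory features: because within-person variability is lower than between-person variability, each retest session lands closest to the same participant's baseline profile. This toy example (hypothetical feature vectors; not the paper's pipeline) measures identification accuracy that way:

```python
import math

def identify(baseline, retest):
    """Match each retest feature vector to its nearest baseline vector
    (Euclidean distance) and return the fraction matched to the same
    participant index."""
    correct = 0
    for j, fb in enumerate(retest):
        dists = [math.dist(fa, fb) for fa in baseline]
        if dists.index(min(dists)) == j:
            correct += 1
    return correct / len(retest)

# Three participants with well-separated (made-up) trajectory features:
# session-2 vectors drift slightly but stay nearest their own baseline.
baseline = [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]]
retest = [[1.0, 0.0], [9.0, 1.0], [0.0, 11.0]]
print(identify(baseline, retest))  # → 1.0
```

Accuracy well above chance (1/n for n participants) is the evidence that trajectory profiles are stable and idiosyncratic.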
Problem

Research questions and friction points this paper is trying to address.

Develops immersive games for cognitive assessment.
Compares game-based metrics with NIH Toolbox.
Enhances individual cognitive variability detection.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Immersive 3D virtual environment
Continuous behavior logging
Process-informed metrics
D. C. Marticorena
Department of Biomedical Engineering, Washington University, 1 Brookings Drive, St. Louis, MO 63130
Zeyu Lu
Shanghai Jiao Tong University
Chris Wissmann
Department of Biomedical Engineering, Washington University, 1 Brookings Drive, St. Louis, MO 63130
Yash Agarwal
Unknown affiliation
David Garrison
Department of Neurology and Pediatrics, Washington University School of Medicine
J. Zempel
Department of Neurology and Pediatrics, Washington University School of Medicine
Dennis L. Barbour
Department of Biomedical Engineering, Washington University, 1 Brookings Drive, St. Louis, MO 63130; Department of Computer Science and Engineering, Washington University, 1 Brookings Drive, St. Louis, MO 63130