Reading Decisions from Gaze Direction during Graphics Turing Test of Gait Animation

📅 2025-03-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates the role of gaze direction in discriminating genuine from synthetic gait. Using a two-alternative forced-choice (2AFC) paradigm, participants judged whether gait videos were captured from real human motion or synthesized using motion primitives (MPs); concurrent eye-tracking recorded gaze trajectories. Information-theoretic analysis revealed that oculomotor features—particularly initial saccade direction—significantly predicted participants’ subjective judgments but bore no relationship to objective ground-truth labels. These findings provide the first empirical evidence that gaze patterns encode subjective decision signals rather than stimulus veracity, establishing a quantifiable implicit behavioral biomarker for the graphics Turing test. By integrating eye-tracking, MP-based modeling, 3D animation generation, and shared information quantification, the work enhances both the validity of decision process assessment and the mechanistic interpretability of motion perception.

Technology Category

Application Category

📝 Abstract
We investigated gaze direction during movement observation. The eye movement data were collected during an experiment, in which different models of movement production (based on movement primitives, MPs) were compared in a two alternatives forced choice task (2AFC). Participants observed side-by-side presentation of two naturalistic 3D-rendered human movement videos, where one video was based on motion captured gait sequence, the other one was generated by recombining the machine-learned MPs to approximate the same movement. The task was to discriminate between these movements while their eye movements were recorded. We are complementing previous binary decision data analyses with eye tracking data. Here, we are investigating the role of gaze direction during task execution. We computed the shared information between gaze features and decisions of the participants, and between gaze features and correct answers. We found that eye movements reflect the decision of participants during the 2AFC task, but not the correct answer. This result is important for future experiments, which should take advantage of eye tracking to complement binary decision data.
Problem

Research questions and friction points this paper is trying to address.

Analyze gaze direction during movement observation tasks
Compare eye movement data with binary decision outcomes
Investigate gaze-decision correlation in gait animation discrimination
Innovation

Methods, ideas, or system contributions that make the work stand out.

Eye tracking complements binary decision data
Gaze direction reflects participant decisions
Shared information between gaze and decisions
🔎 Similar Papers
No similar papers found.