SocialEyes: Scaling mobile eye-tracking to multi-person social settings

📅 2024-07-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Traditional eye-tracking methods, constrained by single-participant laboratory paradigms, fail to capture dynamic attentional interactions in real-world group viewing scenarios (e.g., concerts, films). To address this, we propose the first scalable, multi-device mobile eye-tracking system designed for ecological settings, supporting up to 30 simultaneous participants with millisecond-level temporal synchronization and robust egocentric-to-allocentric gaze projection, i.e., mapping individual first-person fixations into a shared coordinate system. We introduce novel collective gaze analytics and visualization paradigms tailored to group attention dynamics. The system integrates a lightweight operator interface, a distributed clock-synchronization protocol, and a real-time streaming architecture. It was successfully deployed at two public events (N=60 overall; 30 simultaneous viewers each), achieving synchronization errors below 100 ms and accurate gaze projection in challenging dynamic scenes. This work overcomes key technical bottlenecks in the ecological validity and quantitative analysis of social attention, establishing a new methodological foundation for studying collective cognition.
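The millisecond-level synchronization described above rests on recording clock offsets between each device and a common reference, as the abstract also notes. A minimal sketch of NTP-style offset estimation, assuming a four-timestamp request/response exchange between a host and one eye-tracking device; the function names and timestamp values are hypothetical, not the paper's implementation:

```python
def clock_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Estimate the device clock's offset from the host clock.

    t1: host send time, t2: device receive time,
    t3: device send time, t4: host receive time.
    Assumes symmetric network delay in both directions.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0


def round_trip_delay(t1: float, t2: float, t3: float, t4: float) -> float:
    """Round-trip network delay, excluding device processing time."""
    return (t4 - t1) - (t3 - t2)


# Example: device clock running ~50 ms ahead of the host,
# with ~20 ms of symmetric network delay.
t1, t2, t3, t4 = 0.000, 0.060, 0.061, 0.021
offset = clock_offset(t1, t2, t3, t4)     # ~0.050 s (device ahead)
delay = round_trip_delay(t1, t2, t3, t4)  # ~0.020 s
```

Subtracting the estimated offset from each device's timestamps aligns all recordings to the host clock; the round-trip delay bounds the residual error when network delay is not perfectly symmetric.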

📝 Abstract
Eye movements provide a window into human behaviour, attention, and interaction dynamics. Challenges in real-world, multi-person environments have, however, restrained eye-tracking research predominantly to single-person, in-lab settings. We developed a system to stream, record, and analyse synchronised data from multiple mobile eye-tracking devices during collective viewing experiences (e.g., concerts, films, lectures). We implemented lightweight operator interfaces for real-time monitoring, remote troubleshooting, and gaze projection from individual egocentric perspectives to a common coordinate space for shared gaze analysis. We tested the system in a live concert and a film screening with 30 simultaneous viewers during each of two public events (N=60). We observe precise time synchronisation between devices measured through recorded clock offsets, and accurate gaze projection in challenging dynamic scenes. Our novel analysis metrics and visualizations illustrate the potential of collective eye-tracking data for understanding collaborative behaviour and social interaction. This advancement promotes ecological validity in eye-tracking research and paves the way for innovative interactive tools.
Problem

Research questions and friction points this paper is trying to address.

Scaling eye-tracking to multi-person social settings
Real-time monitoring and shared gaze analysis
Understanding collaborative behavior through collective eye-tracking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Synchronized mobile eye-tracking for multi-person settings
Real-time monitoring and remote troubleshooting interfaces
Gaze projection to common coordinate space for analysis
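The gaze-projection contribution above maps each wearer's fixation from their egocentric camera frame into a shared (allocentric) scene space. A minimal sketch, assuming an approximately planar reference scene (e.g., a screen or stage backdrop) and a per-frame 3x3 homography that would in practice be estimated by feature matching; all names here are illustrative, not the paper's API:

```python
import numpy as np


def project_gaze(H: np.ndarray, gaze_xy: tuple) -> tuple:
    """Map a 2D gaze point from the egocentric camera frame into the
    shared scene space via a 3x3 homography H (homogeneous coordinates)."""
    x, y = gaze_xy
    p = H @ np.array([x, y, 1.0])  # lift to homogeneous coordinates
    return (float(p[0] / p[2]), float(p[1] / p[2]))


# Identity homography: shared frame coincides with the camera frame.
assert project_gaze(np.eye(3), (320.0, 240.0)) == (320.0, 240.0)

# A pure 2x scaling, e.g. a camera frame at half the reference resolution.
H_scale = np.diag([2.0, 2.0, 1.0])
assert project_gaze(H_scale, (100.0, 50.0)) == (200.0, 100.0)
```

In a dynamic scene the homography would need re-estimation for every frame and every wearer, which is presumably where the "challenging dynamic scenes" mentioned in the abstract come into play.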
Shreshth Saxena
Dept. of Psychology, Neuroscience & Behaviour, McMaster University, Canada
Areez Visram
Dept. of Computing and Software, McMaster University, Canada
Neil Lobo
Dept. of Computing and Software, McMaster University, Canada
Zahid Mirza
Dept. of Computing and Software, McMaster University, Canada
Mehak Rafi Khan
Dept. of Computing and Software, McMaster University, Canada
Biranugan Pirabaharan
Dept. of Computing and Software, McMaster University, Canada
Alexander Nguyen
Dept. of Psychology, Neuroscience & Behaviour, McMaster University, Canada
Lauren Fink
Dept. of Psychology, Neuroscience & Behaviour, McMaster University, Canada