🤖 AI Summary
Traditional eye-tracking methods, constrained by single-participant laboratory paradigms, fail to capture dynamic attentional interactions in real-world group viewing scenarios (e.g., concerts, films). To address this, we propose the first scalable, multi-device mobile eye-tracking system designed for ecological settings, supporting up to 30 participants with millisecond-level temporal synchronization and robust egocentric-to-allocentric gaze projection, i.e., mapping individual first-person fixations into a shared coordinate system. We introduce novel collective gaze metrics and visualization paradigms tailored to group attention dynamics. The system integrates a lightweight operator interface, a distributed clock-synchronization protocol, and a real-time streaming architecture. It was successfully deployed at two public events (N=60), achieving synchronization errors below 100 ms and accurate gaze projection in challenging dynamic scenes. This work overcomes key technical bottlenecks in the ecological validity and quantitative analysis of social attention, establishing a new methodological foundation for studying collective cognition.
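The summary does not specify how the distributed clock-synchronization protocol works. A common way to measure per-device clock offsets over a local network is an NTP-style round-trip exchange; the sketch below is illustrative only (the function and variable names are assumptions, not from the paper), but it shows how offsets well under 100 ms can be estimated from four timestamps.

```python
def estimate_clock_offset(t0, t1, t2, t3):
    """NTP-style offset estimation from one request/response exchange.

    t0: host send time (host clock)
    t1: device receive time (device clock)
    t2: device send time (device clock)
    t3: host receive time (host clock)

    Returns (offset, round_trip_delay): `offset` is how far the device
    clock is ahead of the host clock, assuming symmetric network delay.
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    round_trip_delay = (t3 - t0) - (t2 - t1)
    return offset, round_trip_delay

# Example: device clock is 0.5 s ahead, one-way delay is 0.05 s.
offset, delay = estimate_clock_offset(0.0, 0.55, 0.65, 0.20)
```

In practice the exchange is repeated and the sample with the smallest round-trip delay is kept, since it bounds the error introduced by asymmetric network latency.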
📝 Abstract
Eye movements provide a window into human behaviour, attention, and interaction dynamics. Challenges in real-world, multi-person environments have, however, restricted eye-tracking research predominantly to single-person, in-lab settings. We developed a system to stream, record, and analyse synchronised data from multiple mobile eye-tracking devices during collective viewing experiences (e.g., concerts, films, lectures). We implemented lightweight operator interfaces for real-time monitoring, remote troubleshooting, and gaze projection from individual egocentric perspectives to a common coordinate space for shared gaze analysis. We tested the system at two public events, a live concert and a film screening, each with 30 simultaneous viewers (N=60). We observed precise time synchronisation between devices, measured through recorded clock offsets, and accurate gaze projection in challenging dynamic scenes. Our novel analysis metrics and visualisations illustrate the potential of collective eye-tracking data for understanding collaborative behaviour and social interaction. This advancement promotes ecological validity in eye-tracking research and paves the way for innovative interactive tools.
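The abstract describes projecting gaze from individual egocentric perspectives into a common coordinate space but does not detail the method. For a roughly planar viewing target (a cinema screen or stage backdrop), a standard approach is a per-frame homography estimated from point correspondences between the egocentric camera image and the shared reference frame. The sketch below is a minimal direct-linear-transform (DLT) implementation under that planarity assumption; the function names are illustrative, not the paper's API.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate a 3x3 homography from >=4 point correspondences via DLT.

    src_pts: (x, y) points in the egocentric camera image.
    dst_pts: matching (u, v) points in the shared reference frame.
    """
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project_gaze(H, gaze_xy):
    """Map one egocentric gaze point into the shared (allocentric) frame."""
    p = H @ np.array([gaze_xy[0], gaze_xy[1], 1.0])
    return p[:2] / p[2]

# Example: the screen's corners seen in the egocentric image (unit square)
# correspond to a 2x2 shared frame; a centred gaze point maps to its centre.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(0, 0), (2, 0), (2, 2), (0, 2)]
H = estimate_homography(src, dst)
shared_gaze = project_gaze(H, (0.5, 0.5))
```

Once every participant's gaze is in this shared frame, group-level measures such as gaze dispersion or pairwise fixation overlap can be computed directly on the projected points.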