🤖 AI Summary
This study addresses the challenge of interpreting and applying eye-tracking data in English Language Arts (ELA) instruction. We designed and implemented a teacher-centered gaze analytics dashboard that integrates user-centered design and data storytelling principles, employing hierarchical visualizations and narrative scaffolding to make eye-movement data pedagogically meaningful. The dashboard incorporates a conversational AI agent, driven by a large language model (LLM), that enables natural-language interaction with multimodal learning analytics. Our key contribution is translating raw eye-tracking metrics into actionable, narrative-driven instructional insights, with LLM-mediated interpretation reducing teachers’ cognitive load. Empirical evaluation shows that the tool improves teachers’ efficiency in inferring students’ cognitive states and classroom engagement, alleviates the burden of data interpretation, and enhances the quality of formative assessment and pedagogical decision-making, thereby validating the feasibility and educational value of gaze analytics in authentic teaching contexts.
📝 Abstract
Eye-tracking offers rich insights into student cognition and engagement, but it remains underutilized in classroom-facing educational technology due to challenges in data interpretation and accessibility. In this paper, we present the iterative design and evaluation of a gaze-based learning analytics dashboard for English Language Arts (ELA), developed through five studies involving teachers and students. Guided by user-centered design and data storytelling principles, we explored how gaze data can support reflection, formative assessment, and instructional decision-making. Our findings demonstrate that gaze analytics can be approachable and pedagogically valuable when supported by familiar visualizations, layered explanations, and narrative scaffolds. We further show how a conversational agent, powered by a large language model (LLM), can lower cognitive barriers to interpreting gaze data by enabling natural-language interaction with multimodal learning analytics. We conclude with design implications for future EdTech systems that aim to integrate novel data modalities into classroom contexts.