Beyond Scores: Explainable Intelligent Assessment Strengthens Pre-service Teachers' Assessment Literacy

📅 2026-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the limitations of traditional pre-service teacher education in assessment literacy (AL), which often emphasizes theory over practice and relies on digital assessment tools that provide opaque scores, thereby hindering reflection and transfer. To bridge this gap, we developed XIA, an explainable intelligent assessment platform that integrates contrastive and counterfactual explanation mechanisms into AL development for the first time. By combining cognitive diagnostic models with visualization techniques, XIA constructs explanatory scaffolds that connect assessment theory with instructional practice, shifting learners’ focus from score-based judgments to evidence-based reasoning. In a controlled experiment with 21 pre-service teachers, XIA significantly enhanced participants’ assessment awareness, reflective capacity, and self-regulation, effectively reducing assessment errors and fostering evidence-oriented assessment thinking.

📝 Abstract
Assessment literacy (AL) is essential for personalized education, yet difficult to cultivate in pre-service teachers. Conventional teacher preparation programs focus on theoretical knowledge, while digital assessment tools commonly provide opaque scores or parameters. These limitations hinder reflection and transfer, leaving AL underdeveloped. We propose XIA, an eXplainable Intelligent Assessment platform that extends statistics-informed support with visualized cognitive diagnostic reasoning, including contrastive and counterfactual explanations. In a pre-post controlled study with 21 pre-service teachers, we combined quantitative tasks and questionnaires with qualitative interviews. The findings offer preliminary evidence that XIA supported reflection, self-regulation, and assessment awareness, and helped reduce assessment errors. Interviews further showed a shift from score-based judgments toward evidence-based reasoning. This work contributes insights into the design of intelligent assessment tools, showing how explanatory scaffolding can bridge assessment theory and classroom practice and support the cultivation of AL in teacher education.
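To make the abstract's "visualized cognitive diagnostic reasoning" concrete, here is a minimal sketch of a DINA-style cognitive diagnosis with a counterfactual explanation. This is an illustration only, not the paper's XIA implementation; the Q-matrix, slip/guess rates, and function names are all assumptions for demonstration.

```python
# Illustrative sketch (NOT the paper's implementation): tiny DINA-style
# cognitive diagnosis plus a counterfactual "what if this skill were
# mastered?" explanation. All parameters below are assumed values.
from itertools import product

Q = [(1, 0), (0, 1), (1, 1)]           # Q-matrix: each item -> required skills
SLIP, GUESS = 0.1, 0.2                 # assumed slip and guess rates

def p_correct(alpha, q):
    """P(correct) under DINA: mastering all required skills -> 1-slip, else guess."""
    mastered = all(a >= r for a, r in zip(alpha, q))
    return 1 - SLIP if mastered else GUESS

def likelihood(alpha, responses):
    """Likelihood of an observed response pattern given a skill profile."""
    L = 1.0
    for q, x in zip(Q, responses):
        p = p_correct(alpha, q)
        L *= p if x == 1 else 1 - p
    return L

def diagnose(responses):
    """Return the skill profile that best explains the response pattern."""
    return max(product([0, 1], repeat=2), key=lambda a: likelihood(a, responses))

def counterfactual(responses, skill):
    """Contrastive view: predicted item success if one extra skill were mastered."""
    alpha = list(diagnose(responses))
    alpha[skill] = 1
    return [p_correct(tuple(alpha), q) for q in Q]

profile = diagnose([1, 0, 0])          # learner solved only item 1
# -> (1, 0): skill 1 mastered, skill 2 not
```

A teacher-facing explanation would then contrast `diagnose(...)` with `counterfactual(...)`: mastering skill 2 raises the predicted success on items 2 and 3, shifting the discussion from the raw score to the evidence behind it.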
Problem

Research questions and friction points this paper is trying to address.

Assessment Literacy
Pre-service Teachers
Intelligent Assessment
Explainable AI
Teacher Education
Innovation

Methods, ideas, or system contributions that make the work stand out.

Explainable AI
Assessment Literacy
Cognitive Diagnostic Modeling
Visual Explanation
Teacher Education
Yuang Wei
Institute of Artificial Intelligence for Education, East China Normal University, Shanghai, China; Department of Computer Science, National University of Singapore, Singapore, Singapore
Fei Wang
National University of Singapore
data mining, artificial intelligence, AI4Education, cognitive diagnosis, interpretability
Yifan Zhang
MiroMind, National University of Singapore | Google PhD Fellow
Machine Learning, Generative Models, AIGC
Brian Y. Lim
Associate Professor, Department of Computer Science, National University of Singapore
Explainable AI, Human-Centered AI, Human-Computer Interaction, Ubiquitous Computing, Machine
Bo Jiang
Institute of Artificial Intelligence for Education, East China Normal University, Shanghai, China