MetaCD: A Meta Learning Framework for Cognitive Diagnosis based on Continual Learning

📅 2025-12-28
🤖 AI Summary
Cognitive diagnosis in intelligent education suffers from weak generalization and poor adaptability due to long-tailed data distributions and concept drift. To address these challenges, this paper proposes the first cognitive diagnosis framework integrating meta-learning and continual learning. It employs a MAML-inspired meta-initialization to enhance few-shot skill modeling, introduces a parameter-protection mechanism to balance cross-task stability and task-specific plasticity, and establishes a fine-grained student–item–skill interaction model. Extensive experiments on five real-world educational datasets demonstrate that our method achieves state-of-the-art performance in both diagnostic accuracy and cross-task generalization. It significantly mitigates long-tail bias and concept drift effects, offering a robust paradigm for cognitive assessment in dynamic educational environments.

📝 Abstract
Cognitive diagnosis is an essential research topic in intelligent education, aimed at assessing students' mastery of different skills. Many research works have used deep learning models to explore the complex interactions between students, questions, and skills. However, the performance of existing methods is frequently limited by long-tailed data distributions and dynamic changes in the data. To address these challenges, we propose a meta-learning framework for cognitive diagnosis based on continual learning (MetaCD). This framework alleviates the long-tailed problem by using meta-learning to learn an optimal initialization, enabling the model to reach good accuracy on new tasks with only a small amount of data. In addition, we employ a continual learning method, a parameter protection mechanism, that gives MetaCD the ability to adapt to new skills or new tasks as the data changes dynamically. MetaCD not only improves the plasticity of the model on a single task, but also ensures its stability and generalization across sequential tasks. Comprehensive experiments on five real-world datasets show that MetaCD outperforms other baselines in both accuracy and generalization.
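The abstract describes learning an optimal initialization via meta-learning so the model adapts to new tasks from few examples. The paper's exact architecture and loops are not given here, so the following is only a minimal first-order MAML sketch on a hypothetical scalar linear model `y = w * x`, with assumed learning rates and a one-example support set per task:

```python
# Minimal first-order MAML sketch (FOMAML) for meta-initialization.
# Assumptions: toy scalar model y = w * x, squared-error loss, one
# support example per task; none of these come from the paper itself.

def loss_grad(w, x, y):
    """Gradient of the squared error 0.5 * (w*x - y)**2 with respect to w."""
    return (w * x - y) * x

def maml_meta_init(tasks, w0=0.0, inner_lr=0.1, meta_lr=0.05, steps=100):
    """Learn an initialization w that adapts quickly to every task.

    tasks: list of (x, y) support examples, one per task (few-shot).
    Inner loop: one gradient step per task from the shared init.
    Outer loop (first-order): meta-gradient evaluated at the adapted weights.
    """
    w = w0
    for _ in range(steps):
        meta_grad = 0.0
        for x, y in tasks:
            w_adapted = w - inner_lr * loss_grad(w, x, y)  # task-specific adaptation
            meta_grad += loss_grad(w_adapted, x, y)        # first-order meta-gradient
        w -= meta_lr * meta_grad / len(tasks)              # update the shared init
    return w
```

For two tasks whose per-task optima are w = 2 and w = 3, the learned initialization settles between them (near 2.5), so a single inner step moves it close to either task's optimum; this is the sense in which meta-initialization helps the long-tailed, few-shot skills.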
Problem

Research questions and friction points this paper is trying to address.

Addresses long-tailed data distribution in cognitive diagnosis
Adapts to dynamic changes in new skills or tasks
Enhances model plasticity, stability, and generalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Meta-learning optimizes initialization for long-tailed data
Continual learning adapts to dynamic skill and task changes
Parameter protection ensures model stability and generalization
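The paper does not detail its parameter protection mechanism here, but a common way to balance stability and plasticity in continual learning is an EWC-style quadratic penalty that anchors parameters important to previous tasks. The sketch below illustrates that general idea; the function name, importance weights, and `lam` coefficient are illustrative assumptions, not the paper's method:

```python
# EWC-style parameter-protection penalty (illustrative sketch; the
# paper's actual mechanism may differ).

def protected_loss(task_loss, params, old_params, importance, lam=1.0):
    """Add a penalty keeping important parameters near their old values.

    importance[i] is large for parameters critical to previous tasks
    (e.g. estimated from the Fisher information). Those parameters are
    held near old_params (stability), while low-importance parameters
    remain free to move for the new task (plasticity).
    """
    penalty = sum(f * (p - p_old) ** 2
                  for f, p, p_old in zip(importance, params, old_params))
    return task_loss + 0.5 * lam * penalty
```

During training on a new task, minimizing `protected_loss` instead of `task_loss` alone trades off fitting the new data against drifting on parameters that earlier tasks depend on.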
Jin Wu
Shanghai Institute of Artificial Intelligence for Education, East China Normal University
Chanjin Zheng
East China Normal University
educational measurement · psychometrics · applied statistics