Contrastive Cross-Course Knowledge Tracing via Concept Graph Guided Knowledge Transfer

πŸ“… 2025-05-14
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Existing knowledge tracing (KT) models are confined to single-course modeling, failing to comprehensively characterize learners’ cross-course knowledge states. To address this limitation, we propose the first zero-shot, large language model (LLM)-driven framework for multi-course KT. First, we employ zero-shot LLM prompting to automatically construct a cross-course concept graph. Second, we design an LLM-to-LM semantic injection mechanism to enhance the representational capacity of graph convolutional networks (GCNs). Third, we introduce a contrastive learning objective to align single-course and cross-course knowledge representations. Evaluated on a multi-course KT benchmark, our method significantly improves prediction accuracy, strengthens cross-course transferability, and enhances both robustness and interpretability of knowledge state representations.
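The contrastive learning objective mentioned above can be sketched as an InfoNCE-style loss over paired learner states: each learner's single-course representation should be most similar to their own cross-course representation, with other learners' states acting as negatives. This is a minimal pure-Python illustration under assumed choices (cosine similarity, a temperature of 0.1, plain list-of-floats vectors), not the paper's exact formulation:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(single, cross, temperature=0.1):
    """InfoNCE-style alignment loss: for learner i, the positive pair is
    (single[i], cross[i]); all other cross-course states are negatives."""
    loss = 0.0
    for i, s in enumerate(single):
        logits = [cosine(s, c) / temperature for c in cross]
        log_denom = math.log(sum(math.exp(l) for l in logits))
        loss += -(logits[i] - log_denom)  # -log softmax of the positive
    return loss / len(single)
```

Correctly aligned state pairs should therefore yield a lower loss than mismatched ones, which is what drives the two views toward agreement during training.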

πŸ“ Abstract
Knowledge tracing (KT) aims to predict learners' future performance based on historical learning interactions. However, existing KT models predominantly focus on data from a single course, limiting their ability to capture a comprehensive understanding of learners' knowledge states. In this paper, we propose TransKT, a contrastive cross-course knowledge tracing method that leverages concept graph guided knowledge transfer to model the relationships between learning behaviors across different courses, thereby enhancing knowledge state estimation. Specifically, TransKT constructs a cross-course concept graph by leveraging zero-shot Large Language Model (LLM) prompts to establish implicit links between related concepts across different courses. This graph serves as the foundation for knowledge transfer, enabling the model to integrate and enhance the semantic features of learners' interactions across courses. Furthermore, TransKT includes an LLM-to-LM pipeline for incorporating summarized semantic features, which significantly improves the performance of Graph Convolutional Networks (GCNs) used for knowledge transfer. Additionally, TransKT employs a contrastive objective that aligns single-course and cross-course knowledge states, thereby refining the model's ability to provide a more robust and accurate representation of learners' overall knowledge states.
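The zero-shot LLM step described in the abstract amounts to asking the model, concept pair by concept pair, whether two concepts from different courses are related, and adding a graph edge when the answer is affirmative. A minimal sketch of that loop is below; the prompt wording, the YES/NO answer format, and the helper names are illustrative assumptions, and no real LLM call is made:

```python
def build_link_prompt(concept_a, course_a, concept_b, course_b):
    """Assemble a zero-shot prompt asking whether two concepts from
    different courses are semantically related (hypothetical wording)."""
    return (
        f"Concept 1: '{concept_a}' (from the course '{course_a}').\n"
        f"Concept 2: '{concept_b}' (from the course '{course_b}').\n"
        "Are these two concepts closely related, such that mastering one "
        "helps in learning the other? Answer YES or NO."
    )

def parse_link_answer(reply):
    """Map a free-form LLM reply to a boolean edge decision."""
    return reply.strip().upper().startswith("YES")

def add_edge_if_related(graph, concept_a, concept_b, reply):
    """Insert an undirected edge between two concepts when the (mock)
    LLM reply judged them related; graph is a dict of adjacency sets."""
    if parse_link_answer(reply):
        graph.setdefault(concept_a, set()).add(concept_b)
        graph.setdefault(concept_b, set()).add(concept_a)
    return graph
```

In practice such pairwise prompting would be batched or filtered (e.g. by embedding similarity) to avoid a quadratic number of LLM calls, but the resulting structure is the same: an implicit-link concept graph spanning courses.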
Problem

Research questions and friction points this paper is trying to address.

Enhance knowledge tracing across different courses
Model relationships between learning behaviors across courses
Improve the accuracy of learners' knowledge state estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages a concept graph for cross-course knowledge transfer
Uses zero-shot LLM prompts to construct the cross-course concept graph
Employs a contrastive objective to align single-course and cross-course knowledge states
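The knowledge-transfer step that these contributions feed into is a GCN pass over the cross-course concept graph: each concept's features are mixed with its neighbors' (including concepts from other courses), then linearly transformed and passed through a nonlinearity. A minimal dependency-free sketch of one such layer, assuming a dense 0/1 adjacency matrix, self-loops, and mean aggregation (a simplification of the usual symmetric normalization):

```python
def gcn_layer(adj, features, weights):
    """One GCN propagation step over a concept graph.
    adj:      n x n dense 0/1 adjacency matrix
    features: n x d_in list of concept feature vectors
    weights:  d_in x d_out linear map
    Each node averages its own and its neighbours' features (self-loop +
    mean aggregation), applies the linear map, then ReLU."""
    n = len(features)
    dim = len(features[0])
    out = []
    for i in range(n):
        neigh = [i] + [j for j in range(n) if adj[i][j]]
        agg = [sum(features[j][d] for j in neigh) / len(neigh)
               for d in range(dim)]
        row = [sum(agg[d] * weights[d][k] for d in range(dim))
               for k in range(len(weights[0]))]
        out.append([max(0.0, x) for x in row])
    return out
```

Stacking two or three such layers lets information from a related concept in one course reach a learner's knowledge state in another, which is the transfer mechanism the abstract attributes to the GCN component.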