🤖 AI Summary
Undergraduate students in the U.S. face significant course selection challenges due to information asymmetry, insufficient academic advising, and overwhelming curricular workloads.
Method: This paper proposes a skill-driven, interpretable course recommendation approach. Leveraging deep learning, it automatically extracts academic concepts from course descriptions to construct a course–skill bipartite graph. A skill-tag-based explanation mechanism is designed to articulate *unexpectedness*—i.e., recommending courses that meaningfully broaden students’ skill profiles—implemented within the AskOski system as an explainable recommendation framework.
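As a rough illustration only (not the paper's actual pipeline), the course–skill bipartite graph described above can be represented as a pair of adjacency maps, with a set difference against a student's skill profile serving as a crude proxy for unexpectedness. The course names, skill tags, and extraction output below are hypothetical placeholders for what the concept-extraction model would produce:

```python
from collections import defaultdict

def build_course_skill_graph(course_skills):
    """Build a bipartite graph as two adjacency maps:
    course -> skills and skill -> courses."""
    course_to_skills = defaultdict(set)
    skill_to_courses = defaultdict(set)
    for course, skills in course_skills.items():
        for skill in skills:
            course_to_skills[course].add(skill)
            skill_to_courses[skill].add(course)
    return course_to_skills, skill_to_courses

def unexpected_skills(graph, course, student_skills):
    """Skills a course offers that are absent from the student's
    current profile -- a simple stand-in for 'unexpectedness'."""
    course_to_skills, _ = graph
    return course_to_skills[course] - set(student_skills)

# Hypothetical extracted concepts; in the paper these would come from
# the deep learning concept-extraction model applied to descriptions.
extracted = {
    "CS 189": {"machine learning", "optimization"},
    "STAT 133": {"data visualization", "R programming"},
}
graph = build_course_skill_graph(extracted)
print(unexpected_skills(graph, "CS 189", {"optimization"}))
# -> {'machine learning'}
```

The novel skills returned here are exactly what a skill-tag-based explanation could surface to the student ("this course adds machine learning to your profile").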
Contribution/Results: To our knowledge, this is the first empirical validation of skill-grounded explanations in a real-world educational setting. Results demonstrate that skill-based explanations significantly increase students' interest in, intention to enroll in, and decision confidence about courses with high unexpectedness, outperforming conventional recommendation baselines. The core contribution lies in integrating cognitive interpretability into educational recommender systems by anchoring explanations in domain-relevant skills, thereby enhancing transparency, pedagogical relevance, and user trust.
📝 Abstract
Academic choice is crucial in U.S. undergraduate education, which grants students significant freedom in course selection. However, navigating this complex academic environment is challenging: information and guidance are limited, the number of choices is overwhelming, and time restrictions and high demand for popular courses compound the problem. Career counselors exist, but their numbers are insufficient, and course recommendation systems, though personalized, often neither account for student perceptions nor provide explanations that help students assess course relevance. In this paper, a deep learning-based concept extraction model is developed to efficiently extract relevant concepts from course descriptions and improve the recommendation process. Using this model, the study examines the effects of skill-based explanations within a serendipitous recommendation framework, tested through the AskOski system at the University of California, Berkeley. The findings indicate that these explanations not only increase user interest, particularly in courses with high unexpectedness, but also bolster decision-making confidence. This underscores the importance of integrating skill-related data and explanations into educational recommendation systems.