Linear Item-Item Model with Neural Knowledge for Session-based Recommendation

📅 2025-04-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the paradigm gap between neural models and linear item-item models in session-based recommendation (SBR), this paper proposes, for the first time, distilling complex sequential transition patterns from neural sequence models into a lightweight linear framework. The method introduces two core components: (1) a Linear knowledge-enhanced Item-item Similarity (LIS) module, which refines item co-occurrence patterns via self-distillation; and (2) a Neural knowledge-enhanced Item-item Transition (NIT) mechanism, which encodes implicit sequential dependencies learned by neural models into interpretable linear item-item weights. Evaluated on six real-world datasets, the unified linear framework significantly outperforms existing linear SBR methods: Recall@20 improves by up to 14.78% and MRR@20 by up to 11.04%, while inference FLOPs drop to 1/813 of those required by the teacher neural model, achieving both high accuracy and remarkable efficiency.
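The efficiency claim rests on the fact that a linear item-item model scores all candidates with a single weight-matrix lookup rather than a neural forward pass. A minimal sketch of this inference step (function and variable names are illustrative, not the authors' code):

```python
import numpy as np

def recommend(session, W, k=20):
    """Score candidates with a linear item-item model.

    session : list of item indices observed in the current session
    W       : (n_items, n_items) learned item-item weight matrix
    Returns the top-k item indices not already in the session.
    """
    scores = W[session].sum(axis=0)   # aggregate rows of W for session items
    scores = scores.copy()
    scores[session] = -np.inf         # do not re-recommend session items
    return np.argsort(-scores)[:k].tolist()

# Toy example: 5 items; item 0 points strongly to item 3, item 1 to item 2.
W = np.zeros((5, 5))
W[0, 3] = 1.0
W[1, 2] = 0.5
print(recommend([0, 1], W, k=2))  # → [3, 2]
```

Because inference is just a row-sum over a (sparse-friendly) matrix, the per-session cost is independent of any neural network, which is where the reported 813x FLOPs reduction comes from.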

📝 Abstract
Session-based recommendation (SBR) aims to predict users' subsequent actions by modeling short-term interactions within sessions. Existing neural models primarily focus on capturing complex dependencies for sequential item transitions. As an alternative solution, linear item-item models mainly identify strong co-occurrence patterns across items and support faster inference speed. Although each paradigm has been actively studied in SBR, their fundamental differences in capturing item relationships and how to bridge these distinct modeling paradigms effectively remain unexplored. In this paper, we propose a novel SBR model, namely Linear Item-Item model with Neural Knowledge (LINK), which integrates both types of knowledge into a unified linear framework. Specifically, we design two specialized components of LINK: (i) Linear knowledge-enhanced Item-item Similarity model (LIS), which refines the item similarity correlation via self-distillation, and (ii) Neural knowledge-enhanced Item-item Transition model (NIT), which seamlessly incorporates complicated neural knowledge distilled from the off-the-shelf neural model. Extensive experiments demonstrate that LINK outperforms state-of-the-art linear SBR models across six real-world datasets, achieving improvements of up to 14.78% and 11.04% in Recall@20 and MRR@20 while showing up to 813x fewer inference FLOPs. Our code is available at https://github.com/jin530/LINK.
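The NIT idea of "incorporating neural knowledge distilled from an off-the-shelf neural model" into a linear model can be illustrated with a standard ridge-regression fit toward teacher scores. This is a generic sketch under assumed data shapes, not the paper's exact objective (`distill_linear_weights` and all variables are hypothetical):

```python
import numpy as np

def distill_linear_weights(X, T, lam=1.0):
    """Fit item-item weights W so that X @ W approximates teacher scores T.

    X   : (n_sessions, n_items) binary session-item matrix
    T   : (n_sessions, n_items) teacher model's soft next-item scores
    lam : L2 regularization strength
    Closed-form ridge solution: W = (X^T X + lam * I)^{-1} X^T T.
    """
    n_items = X.shape[1]
    gram = X.T @ X + lam * np.eye(n_items)
    return np.linalg.solve(gram, X.T @ T)

rng = np.random.default_rng(0)
X = (rng.random((100, 20)) < 0.2).astype(float)  # toy session-item data
T = rng.random((100, 20))                        # stand-in teacher scores
W = distill_linear_weights(X, T, lam=5.0)
print(W.shape)  # → (20, 20)
```

After fitting, the neural teacher is discarded and only the item-item matrix `W` is needed at serving time, which matches the paper's framing of trading a complex sequence model for interpretable linear weights.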
Problem

Research questions and friction points this paper is trying to address.

Bridging linear and neural models for session-based recommendation
Enhancing item similarity with self-distillation in linear models
Incorporating neural knowledge into linear item-item transition models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies linear and neural knowledge in a single item-item framework
Uses self-distillation for item similarity refinement
Achieves high efficiency with low inference FLOPs
🔎 Similar Papers
No similar papers found.