A Pre-trained Zero-shot Sequential Recommendation Framework via Popularity Dynamics

📅 2024-01-03
🏛️ ACM Conference on Recommender Systems
📈 Citations: 5
Influential: 0
🤖 AI Summary
This work addresses the zero-shot cross-domain transfer challenge in sequential recommendation by proposing PrepRec, a novel pretraining framework. It is the first to incorporate dynamic item popularity modeling into sequential recommendation pretraining, learning generalizable and transferable item representations solely from raw interaction sequences—without requiring textual/multimodal side information or explicit cross-domain alignment. The framework comprises a popularity-aware Transformer architecture, unsupervised sequence-level pretraining, and a posterior interpolation fusion mechanism, enabling plug-and-play zero-shot transfer. Evaluated on five real-world datasets, PrepRec achieves state-of-the-art zero-shot transfer performance. When integrated via simple interpolation with existing models, it yields average improvements of +11.8% in Recall@10 and +22.0% in NDCG@10.

📝 Abstract
This paper proposes a novel pre-trained framework for zero-shot cross-domain sequential recommendation without auxiliary information. While using auxiliary information (e.g., item descriptions) seems promising for cross-domain transfer, a cross-domain adaptation of sequential recommenders can be challenging when the target domain differs from the source domain—item descriptions are in different languages; metadata modalities (e.g., audio, image, and text) differ across source and target domains. If we can learn universal item representations independent of the domain type (e.g., groceries, movies), we can achieve zero-shot cross-domain transfer without auxiliary information. Our critical insight is that user interaction sequences highlight shifting user preferences via the popularity dynamics of interacted items. We present a pre-trained sequential recommendation framework: PrepRec, which utilizes a novel popularity dynamics-aware transformer architecture. Through extensive experiments on five real-world datasets, we show that PrepRec, without any auxiliary information, can zero-shot adapt to new application domains and achieve competitive performance compared to state-of-the-art sequential recommender models. In addition, we show that PrepRec complements existing sequential recommenders. With a simple post-hoc interpolation, PrepRec improves the performance of existing sequential recommenders on average by 11.8% in Recall@10 and 22% in NDCG@10. We provide an anonymized implementation of PrepRec at https://github.com/CrowdDynamicsLab/preprec.
Problem

Research questions and friction points this paper is trying to address.

Develop a pre-trained sequential recommendation framework for zero-shot cross-domain transfer
Model item popularity dynamics to learn universal, domain-independent item representations
Improve the performance of existing sequential recommenders via simple post-hoc interpolation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pre-trained sequential recommendation framework
Modeling item popularity dynamics
Zero-shot transfer to new domains
Junting Wang
University of Illinois at Urbana-Champaign
Graph Neural Network · Deep Learning · Data Mining · Recommender System
Praneet Rathi
University of Illinois at Urbana-Champaign
Hari Sundaram
University of Illinois at Urbana-Champaign