Zero-shot Cross-domain Knowledge Distillation: A Case Study on YouTube Music

📅 2026-03-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of training effective recommendation models in low-traffic scenarios, where data sparsity severely limits performance and conventional cross-domain knowledge distillation is hindered by heterogeneity in features, interfaces, and tasks. To overcome these limitations, the authors propose a zero-shot cross-domain knowledge distillation approach that, for the first time, successfully transfers knowledge from a large-scale video recommendation teacher model (YouTube) to a multi-task ranking student model in a real-world, industrial, low-traffic music recommendation system (YouTube Music), without requiring any labeled data in the target domain. Both offline evaluations and online experiments show substantial improvements in recommendation performance, confirming the feasibility and effectiveness of zero-shot cross-domain distillation in heterogeneous, low-data environments.
📝 Abstract
Knowledge Distillation (KD) has been widely used to improve the quality of latency-sensitive models serving live traffic. However, applying KD in production recommender systems with low traffic is challenging: the limited amount of data restricts the teacher model size, and the cost of training a large dedicated teacher may not be justified. Cross-domain KD offers a cost-effective alternative by leveraging a teacher from a data-rich source domain, but it introduces unique technical difficulties, as the features, user interfaces, and prediction tasks can differ significantly. We present a case study of using zero-shot cross-domain KD for multi-task ranking models, transferring knowledge from a large-scale video recommendation platform (YouTube, roughly 100x larger) to a music recommendation application with significantly lower traffic. We share offline and live experiment results and present findings evaluating different KD techniques in this setting across two ranking models on the music app. Our results demonstrate that zero-shot cross-domain KD is a practical and effective approach to improve the performance of ranking models on low-traffic surfaces.
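The abstract does not spell out the distillation objective. A minimal sketch of the core idea, assuming the teacher's per-task logits are turned into soft labels that supervise the student via cross-entropy (an illustrative zero-shot setup, not the paper's exact loss; function and variable names are hypothetical):

```python
import numpy as np

def distillation_loss(teacher_logits, student_logits):
    """Binary cross-entropy of student predictions against the
    teacher's soft labels (sigmoid of teacher logits).

    No ground-truth labels from the target domain are used -- the
    teacher's predictions alone supervise the student, which is what
    makes the transfer "zero-shot" in this sketch.
    """
    t = 1.0 / (1.0 + np.exp(-teacher_logits))  # teacher soft labels
    s = 1.0 / (1.0 + np.exp(-student_logits))  # student probabilities
    eps = 1e-7
    s = np.clip(s, eps, 1.0 - eps)  # avoid log(0)
    return float(np.mean(-(t * np.log(s) + (1.0 - t) * np.log(1.0 - s))))
```

In a multi-task ranking model, a loss of this shape would be computed per task head and summed alongside any available supervised losses; the loss is minimized when the student's logits match the teacher's.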
Problem

Research questions and friction points this paper is trying to address.

Knowledge Distillation
Cross-domain
Low-traffic Recommender Systems
Zero-shot
Model Transfer
Innovation

Methods, ideas, or system contributions that make the work stand out.

Zero-shot
Cross-domain Knowledge Distillation
Multi-task Ranking
Low-traffic Recommendation
Teacher-Student Transfer