Taming the One-Epoch Phenomenon in Online Recommendation System by Two-stage Contrastive ID Pre-training

📅 2025-08-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
ID embeddings in online recommendation systems overfit due to severe long-tail item distributions, giving rise to the "one-epoch problem": training is effectively limited to a single pass over the data, which severely caps model performance. To address this, we propose the first two-stage contrastive pre-training framework designed specifically for ID embeddings: Stage I performs multi-epoch contrastive learning on a lightweight surrogate model to mitigate long-tail bias and improve generalization; Stage II transfers the learned embeddings to downstream tasks, fine-tuning them via online knowledge distillation. This design breaks the one-epoch constraint, enabling stable multi-epoch training without degradation while supporting efficient transfer. Offline experiments confirm the absence of overfitting, and online deployment at Pinterest yielded significant gains in site-wide user engagement. Our core contribution is the integration of two-stage contrastive learning into ID embedding pre-training, establishing a new paradigm for embedding optimization under long-tail data regimes.

📝 Abstract
ID-based embeddings are widely used in web-scale online recommendation systems. However, their susceptibility to overfitting, particularly due to the long-tail nature of data distributions, often limits training to a single epoch, a phenomenon known as the "one-epoch problem." This challenge has driven research efforts to optimize performance within the first epoch by enhancing convergence speed or feature sparsity. In this study, we introduce a novel two-stage training strategy that incorporates a pre-training phase using a minimal model with contrastive loss, enabling broader data coverage for the embedding system. Our offline experiments demonstrate that multi-epoch training during the pre-training phase does not lead to overfitting, and the resulting embeddings improve online generalization when fine-tuned for more complex downstream recommendation tasks. We deployed the proposed system in live traffic at Pinterest, achieving significant site-wide engagement gains.
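The Stage-I idea, pre-training an ID embedding table with a contrastive loss on a minimal model, can be sketched as an in-batch InfoNCE-style objective: each anchor item's embedding should score higher against its positive (e.g. a co-engaged item) than against the other in-batch items serving as negatives. This is a minimal illustrative sketch, not Pinterest's implementation; the temperature value, the dot-product similarity, and the function names are assumptions.

```python
import math

TEMPERATURE = 0.1  # assumed hyperparameter, for illustration only

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def info_nce_loss(anchors, positives, temperature=TEMPERATURE):
    """In-batch contrastive loss: for each anchor i, row i of `positives`
    is the positive pair and all other rows act as negatives."""
    loss = 0.0
    for i, a in enumerate(anchors):
        logits = [dot(a, p) / temperature for p in positives]
        # log-sum-exp with max subtraction for numerical stability
        m = max(logits)
        log_denom = m + math.log(sum(math.exp(l - m) for l in logits))
        loss += log_denom - logits[i]  # -log softmax at the positive index
    return loss / len(anchors)
```

Because every batch item doubles as a negative for every other item, tail items receive gradient signal even when they rarely appear as positives, which is one intuition for why the contrastive pre-training stage tolerates multiple epochs.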
Problem

Research questions and friction points this paper is trying to address.

Addresses overfitting in ID-based recommendation embeddings
Mitigates one-epoch training limitation from long-tail data
Enables multi-epoch pre-training without performance degradation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-stage contrastive ID pre-training strategy
Minimal model with contrastive loss pre-training
Multi-epoch pre-training prevents embedding overfitting
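Stage II then initializes the downstream model with the pre-trained embeddings and fine-tunes them under a distillation term that keeps them close to the frozen pre-trained copy. The sketch below is an illustrative assumption, not the paper's exact recipe: it uses a squared-error penalty toward the frozen teacher embedding and a made-up `alpha` weight, combined with the downstream task gradient in one SGD step.

```python
def distill_step(student, teacher, task_grad, lr=0.1, alpha=0.5):
    """One SGD step on: task_loss + alpha * ||student - teacher||^2.

    `student`: current fine-tuned embedding (list of floats)
    `teacher`: frozen pre-trained embedding from Stage I
    `task_grad`: gradient of the downstream task loss w.r.t. `student`
    """
    return [
        s - lr * (g + 2.0 * alpha * (s - t))  # task gradient + distillation pull
        for s, t, g in zip(student, teacher, task_grad)
    ]
```

With a zero task gradient the update simply pulls the student back toward the pre-trained teacher, so `alpha` trades off downstream adaptation against retaining the generalization learned in Stage I.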