Orthogonal Low Rank Embedding Stabilization

📅 2025-08-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
In recommender systems, model retraining induces embedding space drift—degrading stability of downstream tasks such as similarity search and cold-start inference. This work proposes a lightweight, lossless embedding stabilization method that maps embeddings from successive training rounds into a unified, standardized space without altering the model architecture or training objective. Specifically, it applies an orthogonal low-rank transformation, combining efficient truncated SVD with orthogonal Procrustes alignment. The method strictly preserves dot-product invariance and inference efficiency, supports reversible alignment, and is plug-and-play compatible. Experiments demonstrate that it substantially suppresses embedding fluctuations—reducing average drift by 72%—while significantly improving downstream task stability. Crucially, it fully retains the original model’s prediction accuracy, ensuring no compromise on recommendation quality.

📝 Abstract
The instability of embedding spaces across model retraining cycles presents significant challenges to downstream applications using user or item embeddings derived from recommendation systems as input features. This paper introduces a novel orthogonal low-rank transformation methodology designed to stabilize the user/item embedding space, ensuring consistent embedding dimensions across retraining sessions. Our approach leverages a combination of efficient low-rank singular value decomposition and orthogonal Procrustes transformation to map embeddings into a standardized space. This transformation is computationally efficient, lossless, and lightweight, preserving the dot product and inference quality while reducing operational burdens. Unlike existing methods that modify training objectives or embedding structures, our approach maintains the integrity of the primary model application and can be seamlessly integrated with other stabilization techniques.
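The abstract's pipeline (truncated SVD to a fixed-rank space, then orthogonal Procrustes to rotate the new round's embeddings onto the old round's) can be sketched as follows. This is a minimal numpy illustration, not the authors' released code: the function name `align_embeddings` is hypothetical, rows of the two matrices are assumed to correspond to the same users/items, and the transformation is only lossless when `rank` captures the full effective rank of the embeddings.

```python
import numpy as np

def align_embeddings(E_new, E_old, rank):
    """Hypothetical sketch of orthogonal low-rank embedding alignment.

    Maps embeddings from a new training round into the standardized
    space of the old round: truncated SVD for a fixed-dimension basis,
    then orthogonal Procrustes for rotation. Rows of E_new and E_old
    must refer to the same entities.
    """
    # Truncated SVD of each embedding table (rows = entities, cols = dims);
    # lossless only if `rank` covers the full effective rank.
    U_new, S_new, _ = np.linalg.svd(E_new, full_matrices=False)
    U_old, S_old, _ = np.linalg.svd(E_old, full_matrices=False)
    Z_new = U_new[:, :rank] * S_new[:rank]  # low-rank coords, new round
    Z_old = U_old[:, :rank] * S_old[:rank]  # low-rank coords, old round

    # Orthogonal Procrustes: R = U @ Vt from the SVD of Z_new^T Z_old
    # minimizes ||Z_new @ R - Z_old||_F over orthogonal matrices R.
    U, _, Vt = np.linalg.svd(Z_new.T @ Z_old)
    R = U @ Vt
    # R is orthogonal, so dot products within the aligned space are
    # identical to those in the new round's low-rank space.
    return Z_new @ R
```

Because `R` is orthogonal, the map is invertible (multiply by `R.T`), which is what makes the alignment reversible and plug-and-play: the serving model's scores are untouched.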
Problem

Research questions and friction points this paper is trying to address.

Stabilize embedding spaces across model retraining cycles
Ensure consistent embedding dimensions during retraining
Preserve dot product and inference quality efficiently
Innovation

Methods, ideas, or system contributions that make the work stand out.

Orthogonal low-rank transformation stabilizes embeddings
Uses SVD and Procrustes for standardized mapping
Preserves dot product and inference quality
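The dot-product preservation and reversibility claims above follow directly from the Procrustes rotation being orthogonal. A minimal numpy check on synthetic data (illustrative only, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.normal(size=(100, 16))  # hypothetical new-round embeddings
B = rng.normal(size=(100, 16))  # hypothetical old-round embeddings

# Orthogonal Procrustes: R = U @ Vt from the SVD of A^T B
# minimizes ||A @ R - B||_F over orthogonal matrices R.
U, _, Vt = np.linalg.svd(A.T @ B)
R = U @ Vt

aligned = A @ R
# Dot products (hence similarity rankings) are unchanged by R:
assert np.allclose(aligned @ aligned.T, A @ A.T)
# The alignment is reversible: rotating back recovers the inputs.
assert np.allclose(aligned @ R.T, A)
```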
🔎 Similar Papers
Kevin Zielnicki
Netflix, Los Gatos, California, USA
Ko-Jen Hsiao
Data Scientist, WhisperText Inc.