Rethinking Generative Recommender Tokenizer: Recsys-Native Encoding and Semantic Quantization Beyond LLMs

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the misalignment between semantic ID-based recommendation methods and generative objectives, which leads to weakly coupled collaborative prediction and inefficient sequence modeling. To overcome these limitations, the authors propose ReSID, a framework featuring recommendation-native encoding and quantization mechanisms that enable efficient generative recommendation without reliance on large language models. The core innovations include Field-Aware Masked Autoencoding (FAMAE) to enhance predictive representation learning and Globally Aligned Orthogonal Quantization (GAOQ) to produce low-uncertainty, compact semantic ID sequences. Evaluated across ten benchmark datasets, ReSID consistently outperforms strong baselines by over 10% on average and reduces tokenization costs by up to 122×.
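The summary describes FAMAE as masking at the level of whole feature fields rather than individual dimensions. As a rough illustration of that idea only (the paper's actual architecture and loss are not given here; all shapes, the linear encoder/decoder, and the masking ratio below are assumptions), a field-aware masking step can be sketched as:

```python
# Illustrative field-level masking + reconstruction (NOT the paper's FAMAE;
# shapes, the 0.3 mask ratio, and the linear encoder/decoder are assumptions).
import numpy as np

rng = np.random.default_rng(0)

# Toy item batch: 4 items, 3 structured feature fields (e.g. category,
# brand, price bucket), each field embedded in 8 dimensions.
n_items, n_fields, dim = 4, 3, 8
fields = rng.normal(size=(n_items, n_fields, dim))

# Field-aware masking: drop entire fields, not individual dimensions,
# so reconstruction must use cross-field information.
mask = rng.random((n_items, n_fields)) < 0.3           # True = field masked
masked = np.where(mask[..., None], 0.0, fields)         # zero out masked fields

# Linear stand-ins for the encoder/decoder: concatenate visible fields,
# project to a bottleneck, and project back.
W_enc = rng.normal(size=(n_fields * dim, 16)) * 0.1
W_dec = rng.normal(size=(16, n_fields * dim)) * 0.1
z = masked.reshape(n_items, -1) @ W_enc
recon = (z @ W_dec).reshape(n_items, n_fields, dim)

# As in masked autoencoding, the loss is scored only on masked fields.
loss = ((recon - fields) ** 2)[mask].mean() if mask.any() else 0.0
print(float(loss))
```

The point of masking whole fields is that per-dimension masking is trivially solvable from correlated dimensions within the same field, whereas field-level masking forces the encoder to model relationships between fields.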

📝 Abstract
Semantic ID (SID)-based recommendation is a promising paradigm for scaling sequential recommender systems, but existing methods largely follow a semantic-centric pipeline: item embeddings are learned from foundation models and discretized using generic quantization schemes. This design is misaligned with generative recommendation objectives: semantic embeddings are weakly coupled with collaborative prediction, and generic quantization is inefficient at reducing sequential uncertainty for autoregressive modeling. To address these, we propose ReSID, a recommendation-native, principled SID framework that rethinks representation learning and quantization from the perspective of information preservation and sequential predictability, without relying on LLMs. ReSID consists of two components: (i) Field-Aware Masked Auto-Encoding (FAMAE), which learns predictive-sufficient item representations from structured features, and (ii) Globally Aligned Orthogonal Quantization (GAOQ), which produces compact and predictable SID sequences by jointly reducing semantic ambiguity and prefix-conditional uncertainty. Theoretical analysis and extensive experiments across ten datasets show the effectiveness of ReSID. ReSID consistently outperforms strong sequential and SID-based generative baselines by an average of over 10%, while reducing tokenization cost by up to 122x. Code is available at https://github.com/FuCongResearchSquad/ReSID.
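The abstract contrasts GAOQ with the "generic quantization schemes" used by prior SID pipelines. For context, a common generic scheme in this line of work is residual quantization, which maps a continuous item embedding to a short tuple of code indices; the sketch below shows that baseline idea only, not GAOQ (codebook contents, sizes, and the three-level depth are arbitrary assumptions):

```python
# Generic residual quantization into a semantic ID (a common SID baseline,
# NOT the paper's GAOQ; random codebooks and sizes are for illustration only).
import numpy as np

rng = np.random.default_rng(1)
dim, levels, codebook_size = 8, 3, 16

# One codebook per quantization level.
codebooks = rng.normal(size=(levels, codebook_size, dim))

def semantic_id(embedding):
    """Quantize an embedding into a tuple of `levels` code indices."""
    residual = embedding.copy()
    ids = []
    for level in range(levels):
        # Pick the nearest code at this level, then quantize what remains.
        dists = np.linalg.norm(codebooks[level] - residual, axis=1)
        idx = int(dists.argmin())
        ids.append(idx)
        residual = residual - codebooks[level][idx]
    return tuple(ids)

sid = semantic_id(rng.normal(size=dim))
print(sid)  # a tuple of 3 code indices
```

Each item then becomes a short token sequence that an autoregressive recommender can generate; the abstract's claim is that generic schemes like this ignore prefix-conditional uncertainty, which GAOQ is designed to reduce.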
Problem

Research questions and friction points this paper is trying to address.

Semantic ID
generative recommendation
sequential recommender systems
representation learning
quantization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Semantic ID
Generative Recommender
Field-Aware Masked Auto-Encoding
Orthogonal Quantization
Sequential Recommendation
Yunxiao Liang
Central South University, Changsha, China
Zhongjin Zhang
Central South University, Changsha, China
Yuxuan Zhu
PhD student, University of Illinois Urbana-Champaign
Data systems · AI evaluation
Kerui Zhang
Shopee Pte. Ltd., Shanghai, China
Zhiluohan Guo
Shopee Pte. Ltd., Shanghai, China
Wenhang Zhou
Shopee Pte. Ltd., Shanghai, China
Zonqi Yang
Shopee Pte. Ltd., Shanghai, China
Kangle Wu
Shopee Pte. Ltd., Shanghai, China
Yabo Ni
Nanyang Technological University, Singapore, Singapore
Anxiang Zeng
Nanyang Technological University, Singapore, Singapore
Cong Fu
Texas A&M University, Computer Science
Geometric Deep Learning · AI for Science · Physical Simulations · Molecules · Quantum Many-Body Physics
Jianxin Wang
School of Computer Science and Engineering, Central South University
Algorithm · Bioinformatics · Computer Network
Jiazhi Xia
Central South University
Data Visualization · Visual Analytics