GRAB: An LLM-Inspired Sequence-First Click-Through Rate Prediction Modeling Paradigm

📅 2026-02-02
📈 Citations: 0
Influential Citations: 0
🤖 AI Summary
Traditional deep learning-based recommendation models face limitations in performance, efficiency, generalization, and long-sequence modeling. This work proposes GRAB, an end-to-end generative CTR prediction framework inspired by large language models and grounded in a “sequence-first” paradigm for user behavior modeling. Its core innovation is the Causal Action-aware Multi-channel Attention (CamA) mechanism, which effectively captures temporal dynamics and action signals within user behavior sequences while enabling efficient scaling. Experimental results demonstrate that upon full-scale deployment, GRAB achieves a 3.49% increase in CTR and a 3.05% boost in revenue. Moreover, the model’s representational capacity scales nearly linearly with sequence length, highlighting its strong adaptability to extended user histories.

📝 Abstract
Traditional Deep Learning Recommendation Models (DLRMs) face increasing bottlenecks in performance and efficiency, often struggling with generalization and long-sequence modeling. Inspired by the scaling success of Large Language Models (LLMs), we propose Generative Ranking for Ads at Baidu (GRAB), an end-to-end generative framework for Click-Through Rate (CTR) prediction. GRAB integrates a novel Causal Action-aware Multi-channel Attention (CamA) mechanism to effectively capture temporal dynamics and specific action signals within user behavior sequences. Full-scale online deployment demonstrates that GRAB significantly outperforms established DLRMs, delivering a 3.05% increase in revenue and a 3.49% rise in CTR. Furthermore, the model demonstrates desirable scaling behavior: its expressive power shows a monotonic and approximately linear improvement as longer interaction sequences are utilized.
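The abstract names a Causal Action-aware Multi-channel Attention (CamA) mechanism that captures temporal order and action signals in behavior sequences, but gives no internals. Purely as an illustration of the general idea, here is a minimal NumPy sketch of causal attention over a behavior sequence with one attention channel per action type (e.g. click vs. purchase). The function name, the per-channel random projections, and the masking scheme are all assumptions for this sketch, not the paper's actual CamA design.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_action_attention(item_emb, action_ids, num_actions, rng):
    """Toy causal attention over a behavior sequence with one
    attention channel per action type.

    item_emb:   (T, d) embeddings of the T behaviors
    action_ids: (T,)   integer action type of each behavior
    Returns a (T, d) representation where position t attends only to
    positions <= t (causality), and each action type contributes
    through its own attention channel.
    """
    T, d = item_emb.shape
    causal_mask = np.tril(np.ones((T, T), dtype=bool))
    out = np.zeros_like(item_emb)
    for a in range(num_actions):
        # random per-channel projections stand in for learned weights
        Wq = rng.standard_normal((d, d)) / np.sqrt(d)
        Wk = rng.standard_normal((d, d)) / np.sqrt(d)
        q, k = item_emb @ Wq, item_emb @ Wk
        scores = (q @ k.T) / np.sqrt(d)
        # mask out future positions and behaviors of other action types
        channel_mask = causal_mask & (action_ids[None, :] == a)
        scores = np.where(channel_mask, scores, -1e9)
        # rows with no visible behavior of this type contribute nothing
        valid = channel_mask.any(axis=1, keepdims=True)
        out += np.where(valid, softmax(scores, axis=-1) @ item_emb, 0.0)
    return out
```

A trained model would use learned projections and a learned combination of the channels; the point of the sketch is only how a causal mask and per-action masks compose so that each position summarizes only its own past, split by action type.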
Problem

Research questions and friction points this paper is trying to address.

Click-Through Rate Prediction
Deep Learning Recommendation Models
Long-Sequence Modeling
Generalization
User Behavior Sequences
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generative CTR Prediction
Causal Action-aware Multi-channel Attention
Sequence Modeling
Large Language Model-inspired Recommendation
Scalable Recommendation System
Shaopeng Chen — Baidu Inc., Beijing, China
Chuyue Xie — Baidu Inc., Beijing, China
Huimin Ren — Ph.D. student, Worcester Polytechnic Institute (urban computing, spatio-temporal data mining)
Shaozong Zhang — Baidu Inc., Beijing, China
Han Zhang — Baidu Inc., Beijing, China
Ruobing Cheng — Baidu Inc., Beijing, China
Zhiqiang Cao — Oak Ridge National Lab (flexible electronics, neutron scattering, isotope effect)
Zehao Ju — Baidu Inc., Beijing, China
Yu Gao — Unknown affiliation (algorithms, data structures)
Jie Ding — Associate Professor, University of Minnesota Twin Cities (machine learning, statistics, signal processing, deep learning)
Xiaodong Chen — Baidu Inc., Beijing, China
Xuewu Jiao — Baidu Inc., Beijing, China
Shuanglong Li — Baidu Inc., Beijing, China
Liu Lin — Beijing Jiaotong University (computer vision)