Unconditional Diffusion for Generative Sequential Recommendation

📅 2025-07-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing conditional diffusion models for sequential recommendation weaken user history utilization by modeling the “item ↔ noise” transformation, treating historical sequences as auxiliary conditions rather than central structural anchors. Method: We propose an unconditional Brownian bridge diffusion model that designates the user’s historical behavior sequence as the deterministic terminal point of the diffusion process—rather than a conditional input—thereby focusing exclusively on modeling the evolutionary relationships among items. By constructing bidirectional Brownian bridge paths anchored at the observed history, we jointly constrain noise injection and denoising, eliminating interference from conventional conditional modeling. The model employs an end-to-end denoising architecture to directly generate personalized sequences from pure noise. Contribution/Results: Our approach achieves significant improvements over state-of-the-art methods across multiple public benchmarks, empirically validating the efficacy of history-centric diffusion modeling. The implementation is publicly available.


📝 Abstract
Diffusion models, known for their generative ability to simulate data creation through noise-adding and denoising processes, have emerged as a promising approach for building generative recommenders. To incorporate user history for personalization, existing methods typically adopt a conditional diffusion framework, where the reverse denoising process of reconstructing items from noise is modified to be conditioned on the user history. However, this design may fail to fully utilize historical information, as it gets distracted by the need to model the "item $\leftrightarrow$ noise" translation. This motivates us to reformulate the diffusion process for sequential recommendation in an unconditional manner, treating user history (instead of noise) as the endpoint of the forward diffusion process (i.e., the starting point of the reverse process), rather than as a conditional input. This formulation allows for exclusive focus on modeling the "item $\leftrightarrow$ history" translation. To this end, we introduce Brownian Bridge Diffusion Recommendation (BBDRec). By leveraging a Brownian bridge process, BBDRec enforces a structured noise addition and denoising mechanism, ensuring that the trajectories are constrained towards a specific endpoint -- user history, rather than noise. Extensive experiments demonstrate BBDRec's effectiveness in enhancing sequential recommendation performance. The source code is available at https://github.com/baiyimeng/BBDRec.
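The endpoint-anchored noise addition described in the abstract can be sketched with the standard Brownian bridge marginal: the mean interpolates linearly between the item embedding and the history embedding, and the noise variance vanishes at both ends, so trajectories are pinned to the two anchors. This is a minimal illustration, not the paper's exact parameterization; the function name, `sigma` scaling, and discrete time grid are assumptions.

```python
import numpy as np

def brownian_bridge_forward(x0, xT, t, T, sigma=1.0, rng=None):
    """Sample x_t from a Brownian bridge between x0 (target item
    embedding) and xT (user-history embedding).

    The mean interpolates linearly between the two anchors, and the
    noise variance sigma^2 * t * (T - t) / T is zero at both
    endpoints, so the trajectory is pinned to x0 at t=0 and to xT
    at t=T -- the history, rather than pure noise, is the endpoint.
    """
    rng = rng or np.random.default_rng()
    s = t / T
    mean = (1.0 - s) * x0 + s * xT
    std = sigma * np.sqrt(t * (T - t) / T)
    return mean + std * rng.standard_normal(np.shape(x0))
```

Because the variance term `t * (T - t) / T` is zero at `t = 0` and `t = T`, no conditioning signal is needed: the history enters the process as a deterministic boundary condition.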
Problem

Research questions and friction points this paper is trying to address.

Reformulate diffusion process for sequential recommendation unconditionally
Enhance user history utilization in generative recommenders
Improve item-history translation modeling via Brownian bridge
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unconditional diffusion process using user history
Brownian bridge for structured noise addition
Focus on item-history translation exclusively
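The generation direction implied by these points runs the bridge in reverse: start from the user-history anchor and walk back toward the item. The sketch below is a hypothetical ancestral-sampling loop consistent with the bridge marginal above; `denoiser(x_t, t)` stands in for the learned network predicting the clean item embedding, and is not the paper's actual architecture.

```python
import numpy as np

def generate_from_history(xT, denoiser, T=10, sigma=1.0, rng=None):
    """Reverse the bridge: start at the user-history anchor xT and
    step back to t=0, re-sampling each state around the bridge mean
    implied by the model's current estimate x0_hat of the item.

    `denoiser(x_t, t)` is a placeholder for the learned denoising
    network; here it can be any callable with that signature.
    """
    rng = rng or np.random.default_rng()
    x = np.array(xT, dtype=float, copy=True)
    for t in range(T, 0, -1):
        x0_hat = denoiser(x, t)          # model's estimate of the item
        s = (t - 1) / T
        mean = (1.0 - s) * x0_hat + s * xT
        std = sigma * np.sqrt((t - 1) * (T - (t - 1)) / T)
        x = mean + std * rng.standard_normal(x.shape)
    return x  # at t=0 the bridge variance is zero, so x == x0_hat
```

Note that the final step is deterministic (the bridge variance is zero at `t = 0`), so a perfect denoiser recovers the item embedding exactly.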
Yimeng Bai
University of Science and Technology of China
Recommendation · Generative Recommendation · Large Language Model

Yang Zhang
National University of Singapore

Sihao Ding
Mercedes-Benz R&D North America
Computer Vision · Machine Learning

Shaohui Ruan
ByteDance China

Han Yao
ByteDance China

Danhui Guan
ByteDance China

Fuli Feng
University of Science and Technology of China

Tat-Seng Chua
National University of Singapore