Generating Model Parameters for Controlling: Parameter Diffusion for Controllable Multi-Task Recommendation

📅 2024-10-14
🏛️ arXiv.org
📈 Citations: 4
Influential: 1
🤖 AI Summary
Commercial recommendation systems must dynamically adapt to shifting task objectives (e.g., accuracy–diversity trade-offs), yet retraining models incurs prohibitive computational overhead, hindering online real-time adaptation. To address this, we propose a parameter-level, training-free, real-time controllable adaptation method. Our approach is the first to construct a controllable diffusion model directly in the parameter space and leverage classifier-free guidance to generate task-specific model parameters zero-shot and on-demand. It enables test-time adaptive inference and model-agnostic deployment. Evaluated on multiple public and real-world industrial datasets, our method significantly enhances multi-objective controllability while maintaining state-of-the-art recommendation performance. Parameter generation takes under one second—over 300× faster than full retraining—demonstrating unprecedented efficiency for dynamic recommendation adaptation.

📝 Abstract
Commercial recommender systems face the challenge that task requirements from platforms or users often change dynamically (e.g., varying preferences for accuracy or diversity). Ideally, the model should be re-trained after resetting a new objective function, adapting to these changes in task requirements. However, in practice, the high computational costs associated with retraining make this process impractical for models already deployed to online environments. This raises a new challenging problem: how to efficiently adapt the learning model to different task requirements by controlling model parameters after deployment, without the need for retraining. To address this issue, we propose a novel controllable learning approach via Parameter Diffusion for controllable multi-task Recommendation (PaDiRec), which allows the customization and adaptation of recommendation model parameters to new task requirements without retraining. Specifically, we first obtain the optimized model parameters through adapter tuning based on the feasible task requirements. Then, we utilize the diffusion model as a parameter generator, employing classifier-free guidance in conditional training to learn the distribution of optimized model parameters under various task requirements. Finally, the diffusion model is applied to effectively generate model parameters in a test-time adaptation manner given task requirements. As a model-agnostic approach, PaDiRec can leverage existing recommendation models as backbones to enhance their controllability. Extensive experiments on public datasets and a dataset from a commercial app indicate that PaDiRec can effectively enhance controllability through efficient model parameter generation. The code is released at https://anonymous.4open.science/r/PaDiRec-DD13.
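The abstract's final stage, where a diffusion model trained with classifier-free guidance generates model parameters conditioned on a task requirement at test time, can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's code: the denoiser interface, the linear noise schedule, and the toy stand-in denoiser are all assumptions.

```python
import numpy as np

def cfg_denoise(eps_cond, eps_uncond, guidance_scale):
    """Classifier-free guidance: blend the conditional and unconditional
    noise predictions; guidance_scale = 0 recovers the unconditional model."""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

def generate_parameters(denoiser, task_condition, dim, steps=50,
                        guidance_scale=2.0, seed=0):
    """Reverse-diffuse a parameter vector from Gaussian noise.
    `denoiser(theta_t, t, cond)` predicts the noise; passing cond=None
    yields the unconditional prediction (the condition is randomly
    dropped during training in the classifier-free guidance recipe)."""
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(dim)          # start from pure noise
    betas = np.linspace(1e-4, 0.02, steps)    # simple linear schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    for t in reversed(range(steps)):
        eps_c = denoiser(theta, t, task_condition)
        eps_u = denoiser(theta, t, None)
        eps = cfg_denoise(eps_c, eps_u, guidance_scale)
        # DDPM-style posterior mean (noise term omitted at the last step)
        theta = (theta - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) \
                / np.sqrt(alphas[t])
        if t > 0:
            theta += np.sqrt(betas[t]) * rng.standard_normal(dim)
    return theta

# Toy stand-in for a learned denoiser: nudges parameters toward a
# condition-dependent target (zero when unconditional).
def toy_denoiser(theta, t, cond):
    target = np.zeros_like(theta) if cond is None else cond
    return theta - target

# Generate an 8-dimensional "parameter vector" for a given task condition.
params = generate_parameters(toy_denoiser, task_condition=np.full(8, 0.5), dim=8)
```

Because only a reverse diffusion pass is run (no gradient steps on the recommendation loss), generation cost is independent of the backbone's training cost, which is what enables the sub-second, retraining-free adaptation the summary describes.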
Problem

Research questions and friction points this paper is trying to address.

Adapting recommendation models to dynamic task requirements without retraining
Generating model parameters efficiently for new task objectives
Enhancing controllability of existing recommendation models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter generation for controllable recommendation
Adapter tuning for optimized model parameters
Classifier-free guidance in conditional training
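The adapter-tuning bullet, producing optimized parameters for each feasible task requirement, can be illustrated with a toy linear adapter trained on a requirement-weighted multi-objective loss. Everything here (the accuracy/diversity surrogates, the linear adapter, the weights) is a hypothetical stand-in for the paper's actual setup, in which the backbone is frozen and only adapter parameters are optimized.

```python
import numpy as np

def tune_adapter(X, y, w_acc=0.9, w_div=0.1, lr=0.05, steps=300, seed=0):
    """Gradient descent on adapter weights only; backbone outputs are
    treated as fixed features X. The loss mixes a squared-error accuracy
    surrogate with a negative-variance diversity surrogate, weighted by
    the task requirement (w_acc, w_div)."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal(X.shape[1]) * 0.01
    n = len(y)
    for _ in range(steps):
        s = X @ a
        grad_acc = 2.0 / n * X.T @ (s - y)           # d/da mean((s - y)^2)
        grad_div = -2.0 / n * X.T @ (s - s.mean())   # d/da of -var(s)
        a -= lr * (w_acc * grad_acc + w_div * grad_div)
    return a

# Demo: fit an adapter on synthetic features with a mostly-accuracy requirement.
rng = np.random.default_rng(1)
X = rng.standard_normal((64, 4))
y = X @ np.array([0.5, -0.2, 0.1, 0.3])
adapter = tune_adapter(X, y, w_acc=0.9, w_div=0.1)
```

Running this tuning loop for a grid of (w_acc, w_div) requirements yields the dataset of optimized parameters on which the conditional diffusion generator is then trained.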
Chenglei Shen
Gaoling School of Artificial Intelligence, Renmin University of China
Recommender systems, Large language model
Jiahao Zhao
Institute of Automation, Chinese Academy of Sciences
LLM Alignment
Xiao Zhang
Gaoling School of Artificial Intelligence, Renmin University of China
Weijie Yu
School of Information Technology and Management, University of International Business and Economics
Ming He
AI Lab at Lenovo Research
Jianping Fan
AI Lab at Lenovo Research