Macro Graph of Experts for Billion-Scale Multi-Task Recommendation

📅 2025-06-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In billion-scale multi-task recommendation, existing methods degrade because graph structures are heterogeneous across tasks and macro-level graph topology is ignored. Method: This paper proposes the first multi-task learning framework to incorporate macro-graph structural embeddings. It introduces (1) a Macro Graph Bottom that unifies cross-task expert associations and captures global graph-topological features, and (2) a dynamic Macro Prediction Tower that enables task-specific macro-graph representation learning and adaptive cross-task knowledge fusion. The method integrates graph neural networks, multi-task expert networks, and a dynamic weight-ensemble mechanism. Results: The approach achieves significant improvements over state-of-the-art methods on three public benchmarks. Deployed in the homepage recommendation system of a leading industrial platform, online A/B tests show a +2.1% lift in CTR and a +3.4% increase in user session duration.
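The summary above describes a shared bottom that fuses user/item embeddings with a macro-graph embedding before a bank of task-shared experts. The paper's actual architecture and dimensions are not given here; the following is a minimal numpy sketch under assumed shapes, with hypothetical names (`MacroGraphBottom`, `d_macro`, etc.), to illustrate the general idea of graph-augmented expert inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; MGOE's actual sizes are not stated in the summary.
d_id, d_macro, d_hidden, n_experts = 16, 8, 32, 4

def relu(x):
    return np.maximum(x, 0.0)

class MacroGraphBottom:
    """Sketch (not the paper's implementation): concatenate user/item ID
    embeddings with a macro-graph embedding, then feed the fused vector
    to several task-shared expert networks."""
    def __init__(self):
        d_in = 2 * d_id + d_macro
        self.experts = [
            (rng.normal(0, 0.1, (d_in, d_hidden)), np.zeros(d_hidden))
            for _ in range(n_experts)
        ]

    def __call__(self, user_emb, item_emb, macro_emb):
        x = np.concatenate([user_emb, item_emb, macro_emb])
        # Each expert maps the same fused input to its own representation.
        return np.stack([relu(x @ W + b) for W, b in self.experts])

bottom = MacroGraphBottom()
outs = bottom(rng.normal(size=d_id), rng.normal(size=d_id),
              rng.normal(size=d_macro))
print(outs.shape)  # (4, 32): one hidden vector per expert
```

The key point of the sketch is that the macro-graph embedding enters *before* the experts, so every task-shared expert sees global graph structure rather than only per-ID embeddings.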

📝 Abstract
Graph-based multi-task learning at billion-scale presents a significant challenge, as different tasks correspond to distinct billion-scale graphs. Traditional multi-task learning methods often neglect these graph structures, relying solely on individual user and item embeddings. However, disregarding graph structures overlooks substantial potential for improving performance. In this paper, we introduce the Macro Graph of Experts (MGOE) framework, the first approach capable of leveraging macro graph embeddings to capture task-specific macro features while modeling the correlations between task-specific experts. Specifically, we propose the concept of a Macro Graph Bottom, which, for the first time, enables multi-task learning models to incorporate graph information effectively. We design the Macro Prediction Tower to dynamically integrate macro knowledge across tasks. MGOE has been deployed at scale, powering multi-task learning for the homepage of a leading billion-scale recommender system. Extensive offline experiments conducted on three public benchmark datasets demonstrate its superiority over state-of-the-art multi-task learning methods, establishing MGOE as a breakthrough in multi-task graph-based recommendation. Furthermore, online A/B tests confirm the superiority of MGOE in billion-scale recommender systems.
Problem

Research questions and friction points this paper is trying to address.

Billion-scale multi-task recommendation with graph structures
Leveraging macro graph embeddings for task-specific features
Dynamic integration of macro knowledge across tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Macro Graph of Experts captures task-specific features
Macro Graph Bottom enables graph-based multi-task learning
Macro Prediction Tower dynamically integrates cross-task knowledge
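The last bullet describes the Macro Prediction Tower's dynamic cross-task integration. How the paper implements its gating is not specified here; below is a hedged numpy sketch of one common realization, a per-task softmax gate over shared expert outputs (a dynamic weight ensemble), with all names and shapes being illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed sizes; not taken from the paper.
n_experts, d_hidden, n_tasks = 4, 32, 3

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

class MacroPredictionTower:
    """Sketch of a per-task tower: a learned gate re-weights the shared
    expert outputs (dynamic weight ensemble), then a task head maps the
    fused representation to a prediction."""
    def __init__(self):
        self.gate_W = rng.normal(0, 0.1, (n_experts * d_hidden, n_experts))
        self.head_w = rng.normal(0, 0.1, d_hidden)

    def __call__(self, expert_outs):            # (n_experts, d_hidden)
        gate = softmax(expert_outs.reshape(-1) @ self.gate_W)
        fused = gate @ expert_outs              # convex combination of experts
        logit = fused @ self.head_w
        return 1.0 / (1.0 + np.exp(-logit))     # e.g. a CTR-style probability

towers = [MacroPredictionTower() for _ in range(n_tasks)]
expert_outs = np.abs(rng.normal(size=(n_experts, d_hidden)))
preds = [tower(expert_outs) for tower in towers]
```

Because each task owns its gate, tasks can weight the same shared experts differently, which is the mechanism that lets macro knowledge be integrated per task rather than uniformly.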
Hongyu Yao
Jinan University, Guangzhou, China
Zijin Hong
The Hong Kong Polytechnic University
Text-to-SQL · Large Language Models · Natural Language Processing
Hao Chen
City University of Macau, Macau SAR, China
Yuan-Qi Bei
Zhejiang University, Zhejiang, China
Zhiqing Li
South China Agricultural University, Guangzhou, China
Qijie Shen
Alibaba Group
Recommender System · Graph Neural Networks · Large Language Model
Zuobin Ying
City University of Macau, Macau SAR, China
Huan Gong
National University of Defense Technology, Changsha, China
Feiran Huang
Professor, Jinan University
Recommender Systems · Text-to-SQL · Sentiment Analysis · LLMs · Multimodal Learning