Distributionally Robust Graph Out-of-Distribution Recommendation via Diffusion Model

📅 2025-01-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph-based recommendation systems generalize poorly under out-of-distribution (OOD) scenarios and are highly susceptible to noisy samples in the training data. To address these limitations, the authors propose DRGO, a Distributionally Robust Graph recommendation framework for OOD recommendation. DRGO is the first to integrate diffusion models into graph-based recommendation to mitigate noise in the latent space. It introduces an entropy-regularized distributionally robust optimization (DRO) objective that suppresses the undue influence of noisy samples on the worst-case distribution. The paper further provides theoretical guarantees on the generalization error bound and a mechanistic analysis of noise suppression. Extensive experiments across four benchmark datasets and three types of distribution shift demonstrate that DRGO consistently outperforms state-of-the-art baselines under both IID and OOD settings, improving recommendation accuracy and robustness to noisy samples simultaneously.
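The latent-space denoising the summary describes can be illustrated with a generic DDPM-style forward/reverse process applied to embedding vectors. This is a minimal sketch under assumptions: the linear noise schedule, step count `T`, and embedding dimension are illustrative choices, not DRGO's actual parameterization, and a real model would replace the supplied noise estimate with a learned denoiser.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generic linear noise schedule over T steps (an assumption; the
# paper's exact schedule is not reproduced here).
T = 50
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def q_sample(z0, t):
    """Forward process: corrupt a clean latent embedding z0 at step t."""
    eps = rng.standard_normal(z0.shape)
    zt = np.sqrt(alpha_bars[t]) * z0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return zt, eps

def denoise_step(zt, t, eps_hat):
    """One reverse step given a noise estimate eps_hat.

    In a trained model eps_hat comes from a learned network; here it
    is passed in directly so the sketch stays self-contained.
    """
    coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
    mean = (zt - coef * eps_hat) / np.sqrt(alphas[t])
    if t > 0:
        mean = mean + np.sqrt(betas[t]) * rng.standard_normal(zt.shape)
    return mean

# Demo: corrupt a clean embedding, then invert the forward process
# exactly using the true noise (only possible in this toy setting).
z0 = rng.standard_normal(8)
zt, eps = q_sample(z0, T - 1)
z0_rec = (zt - np.sqrt(1.0 - alpha_bars[T - 1]) * eps) / np.sqrt(alpha_bars[T - 1])
```

With a learned noise estimator in place of the true `eps`, iterating `denoise_step` from `t = T - 1` down to `0` yields a denoised embedding, which is the general mechanism behind diffusion-based denoising in a latent space.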

📝 Abstract
The distributionally robust optimization (DRO)-based graph neural network methods improve recommendation systems' out-of-distribution (OOD) generalization by optimizing the model's worst-case performance. However, these studies fail to consider the impact of noisy samples in the training data, which results in diminished generalization capabilities and lower accuracy. Through experimental and theoretical analysis, this paper reveals that current DRO-based graph recommendation methods assign greater weight to the noise distribution, so that model parameter learning becomes dominated by it. When the model overly focuses on fitting noisy samples in the training data, it may learn irrelevant or meaningless features that cannot be generalized to OOD data. To address this challenge, we design a Distributionally Robust Graph model for OOD recommendation (DRGO). Specifically, our method first employs a simple and effective diffusion paradigm to alleviate the noise effect in the latent space. Additionally, an entropy regularization term is introduced in the DRO objective function to avoid extreme sample weights in the worst-case distribution. Finally, we provide a theoretical proof of the generalization error bound of DRGO as well as a theoretical analysis of how our approach mitigates the effect of noisy samples, which helps to better understand the proposed framework from a theoretical perspective. We conduct extensive experiments on four datasets to evaluate the effectiveness of our framework against three typical distribution shifts, and the results demonstrate its superiority under both independent and identically distributed (IID) and OOD settings.
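The entropy regularization the abstract describes has a well-known closed form: penalizing the worst-case distribution's KL divergence from the uniform distribution turns the worst-case sample weights into a softmax over per-sample losses, so no single (possibly noisy) sample can receive extreme weight. A minimal sketch of this weighting, where the temperature `lam` and the toy loss values are illustrative assumptions rather than values from the paper:

```python
import numpy as np

def entropy_regularized_dro_weights(losses, lam=1.0):
    """Worst-case sample weights under entropy-regularized DRO.

    Solves  max_q  sum_i q_i * loss_i - lam * KL(q || uniform),
    whose maximizer is q_i ∝ exp(loss_i / lam), i.e. a softmax over
    losses with temperature lam. Larger lam pulls the weights toward
    uniform, preventing any one sample from dominating the objective.
    """
    z = np.asarray(losses, dtype=float) / lam
    z -= z.max()                     # shift for numerical stability
    w = np.exp(z)
    return w / w.sum()

# Hypothetical per-sample losses; the last mimics a noisy outlier.
losses = [0.2, 0.3, 0.25, 5.0]
w_small_lam = entropy_regularized_dro_weights(losses, lam=0.1)   # outlier dominates
w_large_lam = entropy_regularized_dro_weights(losses, lam=10.0)  # near uniform
```

With weak regularization the worst-case distribution concentrates almost entirely on the high-loss outlier, which is exactly the failure mode the paper attributes to plain DRO on noisy data; increasing `lam` smooths the weights back toward uniform.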
Problem

Research questions and friction points this paper is trying to address.

Graph Recommendation
Noisy Data
Performance Degradation
Innovation

Methods, ideas, or system contributions that make the work stand out.

DRGO
Robustness
Noise Reduction