FeDecider: An LLM-Based Framework for Federated Cross-Domain Recommendation

📅 2026-02-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of existing large language model (LLM)-based approaches in federated cross-domain recommendation, which are prone to overfitting from local low-rank adapters and suffer from difficulties in aligning cross-domain representations. To overcome these challenges, we propose FeDecider, the first framework to effectively leverage LLMs in this setting. FeDecider decouples client-side low-rank updates and aggregates only their directional components to suppress scale-induced noise, while introducing data-aware personalized weights to facilitate effective cross-domain knowledge fusion. Extensive experiments demonstrate that FeDecider significantly outperforms state-of-the-art methods across multiple cross-domain datasets, achieving superior recommendation performance without compromising user privacy.

📝 Abstract
Federated cross-domain recommendation (Federated CDR) aims to collaboratively learn personalized recommendation models across heterogeneous domains while preserving data privacy. Recently, large language model (LLM)-based recommendation models have demonstrated impressive performance by leveraging LLMs' strong reasoning capabilities and broad knowledge. However, adopting LLM-based recommendation models in Federated CDR scenarios introduces new challenges. First, domain-specific local adapters risk overfitting: the magnitudes of locally optimized parameter updates often vary across domains, causing biased aggregation and overfitting toward domain-specific distributions. Second, unlike traditional recommendation models (e.g., collaborative filtering, bipartite graph-based methods) that learn explicit and comparable user/item representations, LLMs encode knowledge implicitly through autoregressive text-generation training. This makes it harder to effectively measure cross-domain similarities under heterogeneity. To address these challenges, we propose an LLM-based framework for federated cross-domain recommendation, FeDecider. Specifically, FeDecider tackles the challenge of scale-induced noise by disentangling each client's low-rank updates and sharing only their directional components. To enable flexible and effective integration, each client further learns personalized weights that achieve data-aware integration of updates from other domains. Extensive experiments across diverse datasets validate the effectiveness of our proposed FeDecider.
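To make the core aggregation idea concrete, the sketch below illustrates direction-only aggregation with personalized weights on flattened update vectors. This is a minimal illustration of the general principle described in the abstract, not the paper's actual implementation; the function name, the flattened-vector representation, and the toy magnitudes are all assumptions for the example.

```python
import numpy as np

def direction_only_aggregate(deltas, weights):
    """Aggregate per-client low-rank updates by direction only.

    deltas:  list of flattened update vectors, one per client/domain
    weights: per-client personalized weights (assumed to sum to 1)

    Normalizing each update before averaging discards its magnitude,
    so a client whose local updates are large in scale cannot
    dominate the aggregate (the "scale-induced noise" problem).
    """
    dirs = [d / (np.linalg.norm(d) + 1e-12) for d in deltas]
    return sum(w * u for w, u in zip(weights, dirs))

# Toy example: two clients whose updates differ in scale by 100x.
d1 = np.array([10.0, 0.0])   # large-magnitude update, direction (1, 0)
d2 = np.array([0.0, 0.1])    # small-magnitude update, direction (0, 1)

agg = direction_only_aggregate([d1, d2], weights=[0.5, 0.5])
# Naive weighted averaging would be dominated by d1's magnitude;
# direction-only aggregation treats both domains equally: [0.5, 0.5].
```

In the personalized setting described above, each client would learn its own `weights` vector from local data rather than using fixed uniform weights, so each domain controls how much it absorbs from the others.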
Problem

Research questions and friction points this paper is trying to address.

Federated Cross-Domain Recommendation
Large Language Models
Overfitting
Cross-Domain Similarity
Data Heterogeneity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Cross-Domain Recommendation
Large Language Models
Low-Rank Adaptation
Directional Aggregation
Personalized Weight Integration