From Personal to Collective: On the Role of Local and Global Memory in LLM Personalization

📅 2025-09-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language model (LLM) personalization faces two key challenges: the cold-start problem (insufficient historical interactions) and the biasing problem (abundant but skewed histories), both stemming from the absence of explicit cross-user collective knowledge modeling. To address this, the authors propose LoGo, a local-global memory framework that explicitly incorporates collective knowledge into personalized LLM adaptation. Local memory encodes user-specific interaction histories, while global memory captures population-wide shared interests; a mediator module dynamically fuses these complementary signals and resolves conflicts between them. This design jointly preserves individual specificity and group-level commonality. Extensive experiments across multiple benchmarks demonstrate consistent improvements: better personalization for cold-start users and significantly reduced overfitting for users with heavily biased histories. The results indicate that cross-user knowledge transfer is essential for robust, generalizable personalization.

📝 Abstract
Large language model (LLM) personalization aims to tailor model behavior to individual users based on their historical interactions. However, its effectiveness is often hindered by two key challenges: the cold-start problem, where users with limited history provide insufficient context for accurate personalization, and the biasing problem, where users with abundant but skewed history cause the model to overfit to narrow preferences. We identify both issues as symptoms of a common underlying limitation, i.e., the inability to model collective knowledge across users. To address this, we propose a local-global memory framework (LoGo) that combines a personalized local memory with a collective global memory that captures shared interests across the population. To reconcile discrepancies between these two memory sources, we introduce a mediator module designed to resolve conflicts between local and global signals. Extensive experiments on multiple benchmarks demonstrate that LoGo consistently improves personalization quality by both warming up cold-start users and mitigating biased predictions. These results highlight the importance of incorporating collective knowledge to enhance LLM personalization.
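The abstract's core mechanism, a local memory per user, a global memory shared across the population, and a mediator that arbitrates between them, can be illustrated with a minimal sketch. This is not the paper's actual method: the class name `LoGoSketch`, the count-based memories, and the shrinkage-style blending weight `k` are all illustrative assumptions; the paper's memories and mediator operate over LLM representations, not raw counts.

```python
from collections import Counter, defaultdict

class LoGoSketch:
    """Illustrative local-global memory with a simple mediator.

    local: per-user interaction counts (the "local memory").
    global_mem: population-wide counts (the "global memory").
    The mediator interpolates between the two, leaning on global
    memory when a user's history is short (cold-start) and on
    local memory when it is rich (counteracting narrow overfit
    only insofar as the global signal is mixed in).
    """

    def __init__(self):
        self.local = defaultdict(Counter)  # user_id -> item counts
        self.global_mem = Counter()        # item counts over all users

    def record(self, user_id: str, item: str) -> None:
        self.local[user_id][item] += 1
        self.global_mem[item] += 1

    def preference(self, user_id: str, item: str, k: float = 5.0) -> float:
        """Mediated score: weight on local memory grows with history size."""
        hist = self.local[user_id]
        n = sum(hist.values())
        w = n / (n + k)  # 0 for a cold-start user, -> 1 with rich history
        local_p = hist[item] / n if n else 0.0
        total = sum(self.global_mem.values())
        global_p = self.global_mem[item] / total if total else 0.0
        return w * local_p + (1 - w) * global_p
```

A cold-start user (empty local memory) gets pure global preferences, while a heavy user's score is dominated by, but not identical to, their own history; the constant `k` controls how quickly the mediator shifts trust from global to local.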
Problem

Research questions and friction points this paper is trying to address.

Addressing cold-start and biasing in LLM personalization
Modeling collective knowledge across users for personalization
Resolving conflicts between local and global memory signals
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combining local and global memory for personalization
Introducing mediator module to resolve memory conflicts
Warming up cold-start users and mitigating biased predictions via collective knowledge