Don't Start Over: A Cost-Effective Framework for Migrating Personalized Prompts Between LLMs

📅 2026-01-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge that updates to large language models often render user-customized soft prompts obsolete, necessitating costly full retraining. To overcome this, we propose PUMA, a lightweight framework that enables efficient transfer of personalized prompts across incompatible large language model architectures for the first time. PUMA integrates parameter-efficient fine-tuning adapters with a grouped user selection strategy, supporting complex migration scenarios such as chaining and aggregation. Extensive experiments on three large-scale datasets demonstrate that PUMA achieves performance comparable to or even surpassing that of training from scratch, while reducing computational costs by up to 98%. The framework also exhibits strong generalization and robustness across diverse settings.

📝 Abstract
Personalization in Large Language Models (LLMs) often relies on user-specific soft prompts. However, these prompts become obsolete when the foundation model is upgraded, necessitating costly, full-scale retraining. To overcome this limitation, we propose the Prompt-level User Migration Adapter (PUMA), a lightweight framework to efficiently migrate personalized prompts across incompatible models. PUMA utilizes a parameter-efficient adapter to bridge the semantic gap, combined with a group-based user selection strategy to significantly reduce training costs. Experiments on three large-scale datasets show our method matches or even surpasses the performance of retraining from scratch, reducing computational cost by up to 98%. The framework demonstrates strong generalization across diverse model architectures and robustness in advanced scenarios like chained and aggregated migrations, offering a practical path for the sustainable evolution of personalized AI by decoupling user assets from the underlying models.
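The abstract describes PUMA's core idea: a small, parameter-efficient adapter that maps a user's soft prompt from the source model's embedding space into an incompatible target model's space, avoiding full retraining. The sketch below illustrates that mapping under stated assumptions — the dimensions, the two-layer MLP form, and the name `migrate_prompt` are illustrative guesses, not the paper's actual architecture or API.

```python
import numpy as np

# Hedged sketch of a prompt-migration adapter in the spirit of PUMA.
# All shapes and the MLP structure are assumptions for illustration only.

rng = np.random.default_rng(0)

D_SRC, D_TGT, HIDDEN, PROMPT_LEN = 768, 1024, 256, 10

# A small adapter holds far fewer parameters than retraining a full
# soft prompt per user on the upgraded model, which is where the
# claimed cost savings would come from.
W1 = rng.standard_normal((D_SRC, HIDDEN)) * 0.02
W2 = rng.standard_normal((HIDDEN, D_TGT)) * 0.02

def migrate_prompt(src_prompt: np.ndarray) -> np.ndarray:
    """Map a soft prompt from the source embedding space
    (PROMPT_LEN, D_SRC) into the target space (PROMPT_LEN, D_TGT)."""
    h = np.maximum(src_prompt @ W1, 0.0)  # ReLU bottleneck
    return h @ W2

# Migrate one user's learned prompt to the new model's dimensionality.
user_prompt_old = rng.standard_normal((PROMPT_LEN, D_SRC))
user_prompt_new = migrate_prompt(user_prompt_old)
print(user_prompt_new.shape)  # (10, 1024)
```

In practice the adapter weights would be trained (per the abstract, on a small group of selected users rather than the full population), then reused to migrate every user's prompt; chained migrations would compose such adapters across successive model versions.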
Problem

Research questions and friction points this paper is trying to address.

personalized prompts
model migration
large language models
soft prompts
user personalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

prompt migration
parameter-efficient adaptation
personalized LLMs
user-specific soft prompts
model upgrading