🤖 AI Summary
This work addresses the semantic mismatch between time-series data and large language models, as well as the difficulty federated learning has in capturing discrete, recurring temporal patterns. To this end, the authors propose FeDPM, a novel framework that, for the first time, introduces discrete prototype memory into federated time-series foundation models. FeDPM constructs local prototype memory priors, aligns cross-domain memories to form a unified discrete latent space, and incorporates a domain-specific memory update mechanism to balance shared knowledge with personalized modeling needs. Experimental results demonstrate that FeDPM significantly outperforms existing federated learning approaches across multiple time-series tasks, effectively capturing the discrete semantic structure of temporal dynamics while balancing cross-domain collaboration and personalization.
📝 Abstract
Leveraging Large Language Models (LLMs) as federated learning (FL)-based time-series foundation models offers a promising way to transfer the generalization capabilities of LLMs to time-series data while preserving the privacy of local data. However, the semantic misalignment between time-series data and the text-centric latent space of existing LLMs often leads to degraded performance. Meanwhile, the parameter-sharing mechanism in existing FL methods maps heterogeneous cross-domain time-series data into a unified continuous latent space, which contradicts the fact that time-series semantics frequently manifest as discrete and recurring regimes. To address these limitations, we propose \textsc{FeDPM}, a federated framework for time-series foundation models based on discrete prototypical memories. Specifically, we learn local prototypical memory priors for intra-domain time-series data. We then align cross-domain memories to promote a unified discrete latent space and introduce a domain-specific memory update mechanism to balance shared and personalized prototypical knowledge. Extensive experiments demonstrate the efficiency and effectiveness of \textsc{FeDPM}. The code is publicly available at https://anonymous.4open.science/r/FedUnit-64D1.
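To make the core idea concrete, here is a minimal sketch of a discrete prototype memory in the style of vector quantization: time-series embeddings are snapped to their nearest prototype (a discrete "regime"), prototypes drift toward local data via a domain-specific update, and a blending step mixes a shared global codebook with the local one. Everything here is an illustrative assumption (names like `PrototypeMemory`, the EMA-style update, and the blending coefficient `alpha` are hypothetical), not FeDPM's actual implementation.

```python
import numpy as np

class PrototypeMemory:
    """Illustrative discrete prototype memory (VQ-style nearest-prototype lookup).

    NOTE: hypothetical sketch; num_prototypes, dim, and the update rule
    are assumptions, not the paper's actual design.
    """

    def __init__(self, num_prototypes=8, dim=4, seed=0):
        rng = np.random.default_rng(seed)
        # Each row is one prototype vector (one discrete regime).
        self.prototypes = rng.normal(size=(num_prototypes, dim))

    def quantize(self, x):
        # Assign each embedding in x (n, dim) to its nearest prototype.
        d = ((x[:, None, :] - self.prototypes[None, :, :]) ** 2).sum(axis=-1)
        idx = d.argmin(axis=1)                 # discrete codes
        return idx, self.prototypes[idx]       # codes + quantized embeddings

    def local_update(self, x, idx, lr=0.1):
        # Domain-specific update: move each used prototype toward the
        # mean of the local embeddings assigned to it.
        for k in np.unique(idx):
            self.prototypes[k] += lr * (x[idx == k].mean(axis=0) - self.prototypes[k])

def align_with_global(local_protos, global_protos, alpha=0.5):
    # Balance shared vs. personalized knowledge: interpolate between the
    # aggregated global codebook and the client's local codebook.
    return alpha * global_protos + (1 - alpha) * local_protos
```

In a full federated setup, each client would quantize its series with the local memory, apply `local_update` on private data, and periodically call `align_with_global` with a server-aggregated codebook; here those steps are only sketched to show how a discrete latent space can coexist with personalization.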