Gradients as an Action: Towards Communication-Efficient Federated Recommender Systems via Adaptive Action Sharing

📅 2025-07-07
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Federated recommendation systems (FedRecs) face two critical challenges: high communication overhead induced by item embeddings and low training efficiency under device heterogeneity. Existing compression methods reduce communication costs but degrade recommendation accuracy due to parameter distortion. To address this, we propose FedRAS, a novel framework that reformulates item embedding gradients as shareable "actions." By imposing gradient direction constraints and applying adaptive clustering, FedRAS aggregates numerous client gradients into a compact set of update actions, dynamically scaling the action count to accommodate heterogeneous network conditions and device capabilities. This design drastically reduces communication load without compromising model performance. Extensive experiments demonstrate that FedRAS achieves up to 96.88% communication reduction while maintaining state-of-the-art recommendation accuracy across diverse heterogeneous settings. The implementation is publicly available.

๐Ÿ“ Abstract
As a promising privacy-aware collaborative model training paradigm, Federated Learning (FL) is becoming popular in the design of distributed recommender systems. However, Federated Recommender Systems (FedRecs) greatly suffer from two major problems: i) extremely high communication overhead due to massive item embeddings involved in recommendation systems, and ii) intolerably low training efficiency caused by the entanglement of both heterogeneous network environments and client devices. Although existing methods attempt to employ various compression techniques to reduce communication overhead, due to the parameter errors introduced by model compression, they inevitably suffer from model performance degradation. To simultaneously address the above problems, this paper presents a communication-efficient FedRec framework named FedRAS, which adopts an action-sharing strategy to cluster the gradients of item embedding into a specific number of model updating actions for communication rather than directly compressing the item embeddings. In this way, the cloud server can use the limited actions from clients to update all the items. Since gradient values are significantly smaller than item embeddings, constraining the directions of gradients (i.e., the action space) introduces smaller errors compared to compressing the entire item embedding matrix into a reduced space. To accommodate heterogeneous devices and network environments, FedRAS incorporates an adaptive clustering mechanism that dynamically adjusts the number of actions. Comprehensive experiments on well-known datasets demonstrate that FedRAS can reduce the size of communication payloads by up to 96.88%, while not sacrificing recommendation performance within various heterogeneous scenarios. We have open-sourced FedRAS at https://github.com/mastlab-T3S/FedRAS.
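The abstract's core idea, clustering per-item gradient updates into a small set of shared "actions", can be sketched with plain k-means. This is a simplified illustration, not the paper's implementation: the helper `gradients_to_actions` and all parameter names are hypothetical, and FedRAS additionally constrains gradient directions and adapts the action count per client.

```python
import numpy as np

def gradients_to_actions(grads, num_actions, iters=10, seed=0):
    """Cluster per-item embedding gradients into a small set of shared
    'actions' (cluster centroids) via plain k-means. Hypothetical
    simplification of FedRAS's action-sharing idea."""
    rng = np.random.default_rng(seed)
    # Initialize actions from randomly chosen gradient rows.
    centroids = grads[rng.choice(len(grads), num_actions, replace=False)]
    for _ in range(iters):
        # Assign each item's gradient to its nearest action.
        dists = np.linalg.norm(grads[:, None, :] - centroids[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Recompute each action as the mean of its assigned gradients.
        for k in range(num_actions):
            members = grads[assign == k]
            if len(members):
                centroids[k] = members.mean(axis=0)
    return centroids, assign

# A client would upload the small action table plus one index per item,
# instead of the full item-embedding gradient matrix.
grads = np.random.default_rng(1).normal(size=(1000, 16)).astype(np.float32)
actions, idx = gradients_to_actions(grads, num_actions=8)
```

The server can then reconstruct an approximate update for every item from the action table and the per-item indices, which is where the payload savings come from.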
Problem

Research questions and friction points this paper is trying to address.

Reducing communication overhead in federated recommender systems
Improving training efficiency in heterogeneous network environments
Minimizing model performance degradation from compression errors
Innovation

Methods, ideas, or system contributions that make the work stand out.

Action-sharing strategy clusters item-embedding gradients into a compact set of update actions
Adaptive clustering adjusts the number of actions to device and network heterogeneity
Reduces payload size by up to 96.88% without sacrificing recommendation accuracy
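A back-of-envelope calculation shows why sharing a small action table shrinks the payload. The numbers below are illustrative assumptions, not figures from the paper: n items with fp32 embeddings of dimension d, compressed to an action table plus a one-byte index per item.

```python
# Hypothetical payload comparison (illustrative numbers, not from the paper).
n_items, dim = 10_000, 32        # item count and embedding dimension
num_actions = 64                 # size of the shared action set

full = n_items * dim * 4                      # full fp32 gradient matrix, bytes
shared = num_actions * dim * 4 + n_items * 1  # action table + 1-byte index per item

reduction = 1 - shared / full
print(f"payload: {full} -> {shared} bytes ({reduction:.2%} smaller)")
```

With these assumed numbers the upload shrinks by roughly 98%, the same order of magnitude as the reduction the paper reports.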