SecEmb: Sparsity-Aware Secure Federated Learning of On-Device Recommender System with Large Embedding

📅 2025-05-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address index leakage of rated items and the excessive communication overhead caused by sparse embedding updates in federated recommendation systems (FedRec), this paper proposes the first lossless, end-to-end secure sparse aggregation framework. The method combines homomorphic encryption, oblivious transfer (OT), sparse index obfuscation, and sharded embedding retrieval into a lightweight security protocol stack. A dual-module design—privacy-preserving embedding retrieval and secure update aggregation—jointly ensures that the server learns neither users' rated-item indices nor their individual embedding updates. Under resource-constrained edge settings, the framework reduces communication cost by up to 90× compared with existing secure FedRec schemes and cuts client-side computation time by up to 70×, while achieving significantly higher model utility than lossy compression-based approaches.
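The core privacy requirement above—aggregating sparse per-item updates without revealing which items each user rated—can be illustrated with a toy secure-aggregation sketch. Note this is not the paper's protocol: SecEmb uses homomorphic encryption and OT to keep payloads sparse, whereas the sketch below hides indices the naive way, by scattering each sparse update into a dense vector and adding pairwise-cancelling masks (so it pays dense communication, exactly the cost SecEmb avoids). All names and parameters here are illustrative.

```python
# Toy sketch of index-hiding secure aggregation via pairwise additive masks.
# NOT SecEmb's actual protocol (which uses HE + oblivious transfer); this only
# illustrates the privacy goal: the server sees masked dense vectors, learning
# neither rated-item indices nor individual updates, yet recovers the exact sum.
import numpy as np

def pairwise_masks(client_ids, dim, seed=0):
    """Derive masks that cancel across clients: sum of all masks is zero.
    Each pair (a, b) shares one random vector, added for a and subtracted for b
    (in practice the shared randomness would come from a key agreement)."""
    masks = {c: np.zeros(dim) for c in client_ids}
    for i, a in enumerate(client_ids):
        for b in client_ids[i + 1:]:
            rng = np.random.default_rng(hash((a, b, seed)) % (2**32))
            shared = rng.normal(size=dim)
            masks[a] += shared
            masks[b] -= shared
    return masks

def client_upload(sparse_update, dim, mask):
    """Scatter a sparse update {item_index: delta} into a dense vector, then
    mask it. The masked dense vector reveals neither indices nor values."""
    dense = np.zeros(dim)
    for idx, delta in sparse_update.items():
        dense[idx] = delta
    return dense + mask

# --- demo: three clients, item-embedding table of 6 items (1-D for brevity) ---
clients = [0, 1, 2]
dim = 6
masks = pairwise_masks(clients, dim)
updates = {0: {1: 0.5}, 1: {1: 0.25, 4: -1.0}, 2: {3: 2.0}}
uploads = [client_upload(updates[c], dim, masks[c]) for c in clients]
aggregate = np.sum(uploads, axis=0)  # masks cancel; server learns only the sum
```

The demo recovers the exact aggregate (0.75 at index 1, 2.0 at index 3, -1.0 at index 4) while each individual upload looks like noise. SecEmb's contribution is achieving this same index privacy while still transmitting only the sparse portion of the update.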

📝 Abstract
Federated recommender system (FedRec) has emerged as a solution to protect user data through collaborative training techniques. A typical FedRec involves transmitting the full model and entire weight updates between edge devices and the server, causing significant burdens to devices with limited bandwidth and computational power. While the sparsity of embedding updates provides an opportunity for payload optimization, existing sparsity-aware federated protocols generally sacrifice privacy for efficiency. A key challenge in designing a secure, sparsity-aware, efficient protocol is to protect the rated item indices from the server. In this paper, we propose a lossless secure recommender system based on sparse embedding updates (SecEmb). SecEmb reduces user payload while ensuring that the server learns nothing about either the rated item indices or the individual updates beyond the aggregated model. The protocol consists of two correlated modules: (1) a privacy-preserving embedding retrieval module that allows users to download relevant embeddings from the server, and (2) an update aggregation module that securely aggregates updates at the server. Empirical analysis demonstrates that SecEmb reduces both download and upload communication costs by up to 90x and decreases user-side computation time by up to 70x compared with secure FedRec protocols. Additionally, it offers non-negligible utility advantages compared with lossy message compression methods.
Problem

Research questions and friction points this paper is trying to address.

Secure federated learning for on-device recommender systems
Reduce communication costs while preserving user privacy
Optimize sparse embedding updates without sacrificing efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sparsity-aware secure federated learning protocol
Privacy-preserving embedding retrieval module
Secure update aggregation for FedRec