Federated One-Shot Learning with Data Privacy and Objective-Hiding

📅 2025-04-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
In federated learning, simultaneously protecting both the clients' data privacy and the privacy of the aggregation server's objective (i.e., its target function) remains an open challenge. Method: We propose the first one-shot dual-privacy framework, featuring a three-phase protocol that integrates secret sharing with graph-structured private information retrieval, bypassing the classical restriction of secure function evaluation to linear or polynomial functions, and introduces a knowledge-distillation-inspired privacy encoding strategy that models the target function implicitly. Contribution/Results: Under information-theoretic security guarantees, our framework is the first to provably hide both the clients' raw data and the global optimization objective. Experiments show significantly lower communication overhead and higher model accuracy than existing schemes adapted to this setting, validating the feasibility and practicality of objective privacy protection.

📝 Abstract
Privacy in federated learning is crucial, encompassing two key aspects: safeguarding the privacy of clients' data and maintaining the privacy of the federator's objective from the clients. While the first aspect has been extensively studied, the second has received much less attention. We present a novel approach that addresses both concerns simultaneously, drawing inspiration from techniques in knowledge distillation and private information retrieval to provide strong information-theoretic privacy guarantees. Traditional private function computation methods could be used here; however, they are typically limited to linear or polynomial functions. To overcome these constraints, our approach unfolds in three stages. In stage 0, clients perform the necessary computations locally. In stage 1, these results are shared among the clients, and in stage 2, the federator retrieves its desired objective without compromising the privacy of the clients' data. The crux of the method is a carefully designed protocol that combines secret-sharing-based multi-party computation and a graph-based private information retrieval scheme. We show that our method outperforms existing tools from the literature when properly adapted to this setting.
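The three-stage flow described in the abstract can be illustrated with plain additive secret sharing over a prime field. This is a hypothetical simplification (the names `share`, `local_results`, and the field modulus are illustrative, not from the paper), and it omits the graph-based PIR component that hides the federator's objective; it only shows how stage-1 sharing lets stage 2 reconstruct an aggregate without exposing any client's individual result:

```python
import random

PRIME = 2**31 - 1  # illustrative field modulus

def share(value, n):
    """Split a value into n additive shares that sum to value mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Stage 0: each client computes its local result (dummy values here).
local_results = [7, 13, 42]
n = len(local_results)

# Stage 1: each client splits its result into shares and distributes them;
# client j ends up holding one share from every client and sums them locally.
all_shares = [share(v, n) for v in local_results]
held = [sum(all_shares[i][j] for i in range(n)) % PRIME for j in range(n)]

# Stage 2: the federator combines the aggregated shares; only the sum is
# revealed, never any individual client's result.
total = sum(held) % PRIME
print(total)  # 62, i.e. 7 + 13 + 42
```

Each `held[j]` in isolation is uniformly random, so no single client (or the federator, seeing any proper subset) learns another client's value.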
Problem

Research questions and friction points this paper is trying to address.

Ensuring data privacy in federated one-shot learning
Protecting federator's objective privacy from clients
Overcoming limitations of traditional private function computation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines knowledge distillation and private information retrieval
Uses three-stage secret-sharing-based computation protocol
Integrates graph-based private information retrieval scheme
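To give intuition for the private-information-retrieval ingredient, here is a minimal two-server XOR-based PIR sketch. It is not the paper's graph-based scheme (the database contents and variable names are invented for illustration); it only demonstrates the core principle that a retriever can fetch one record while each server's query looks uniformly random:

```python
import secrets
from functools import reduce

database = [11, 22, 33, 44]  # e.g. candidate objective values (illustrative)
n = len(database)
target = 2                   # index the federator wants, kept private

# Query: a random subset of indices for server A; server B receives the
# same subset with the target index flipped. Each query alone is uniform.
query_a = [secrets.randbelow(2) for _ in range(n)]
query_b = query_a.copy()
query_b[target] ^= 1

def answer(db, query):
    """Server-side: XOR of all records selected by the query bits."""
    return reduce(lambda x, y: x ^ y, (v for v, q in zip(db, query) if q), 0)

# XOR-ing both answers cancels every record except the target one.
retrieved = answer(database, query_a) ^ answer(database, query_b)
print(retrieved)  # 33
```

Since the two queries differ only at `target`, all other records appear in both answers and cancel under XOR, while neither server can tell which index was flipped.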