Differentially Private Gradient Flow based on the Sliced Wasserstein Distance for Non-Parametric Generative Modeling

📅 2023-12-13
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
Addressing the challenge of balancing privacy preservation and generation quality in generative model training, this paper proposes a nonparametric gradient flow framework based on the Gaussian-smoothed sliced Wasserstein distance. By inherently embedding differential privacy into the stochastic differential equation (SDE) dynamics augmented with Wiener noise, it achieves, for the first time, a theoretical coupling between gradient flows and differential privacy—revealing an intrinsic unification between the SDE drift structure and Gaussian noise injection. The method directly models data distribution evolution in the space of probability measures, circumventing explicit parametric assumptions. Experiments demonstrate that under stringent low-privacy budgets (ε < 2), our approach reduces the Fréchet Inception Distance (FID) by 18.7% on CIFAR-10 compared to state-of-the-art private generative models, significantly improving image fidelity.
📝 Abstract
Safeguarding privacy in sensitive training data is paramount, particularly in the context of generative modeling. This can be achieved through either differentially private stochastic gradient descent or a differentially private metric for training models or generators. In this paper, we introduce a novel differentially private generative modeling approach based on a gradient flow in the space of probability measures. To this end, we define the gradient flow of the Gaussian-smoothed Sliced Wasserstein Distance, including the associated stochastic differential equation (SDE). By discretizing and defining a numerical scheme for solving this SDE, we demonstrate the link between smoothing and differential privacy based on a Gaussian mechanism, due to a specific form of the SDE's drift term. We then analyze the differential privacy guarantee of our gradient flow, which accounts for both the smoothing and the Wiener process introduced by the SDE itself. Experiments show that our proposed model can generate higher-fidelity data at a low privacy budget compared to a generator-based model, offering a promising alternative.
Problem

Research questions and friction points this paper is trying to address.

Privacy Protection
Generative Models
Differential Privacy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein Distance
Gradient Flow
Privacy-Preserving Generative Modeling
🔎 Similar Papers
No similar papers found.
I
Ilana Sebag
Criteo AI Lab, Paris, France; Miles Team, LAMSADE, Université Paris-Dauphine, PSL University, CNRS, Paris, France
M
Muni Sreenivas Pydi
Miles Team, LAMSADE, Université Paris-Dauphine, PSL University, CNRS, Paris, France
Jean-Yves Franceschi
Jean-Yves Franceschi
Criteo AI Lab
AIMachine LearningDeep Learning
Alain Rakotomamonjy
Alain Rakotomamonjy
Criteo AI Lab - Université de Rouen
Machine LearningOptimal TransportMachine ListeningBrain-Computer Interfaces
Mike Gartrell
Mike Gartrell
Lead AI Research Scientist, Sigma Nova
Generative AIDeterminantal ModelsNLPRecommender SystemsAI for Neuroscience
J
Jamal Atif
Miles Team, LAMSADE, Université Paris-Dauphine, PSL University, CNRS, Paris, France
Alexandre Allauzen
Alexandre Allauzen
Université Paris-Dauphine and ESPCI, PSL
Deep-LearningStatistical language modelingStatistical Machine Translation