N-Parties Private Structure and Parameter Learning for Sum-Product Networks

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses privacy-preserving Sum-Product Network (SPN) modeling in multi-party data collaboration scenarios. The authors propose the first secure multi-party SPN framework that jointly optimizes structure generation and parameter learning. Leveraging secret sharing and secure multi-party computation protocols, they construct a randomized SPN forest and design a secure weighted inference mechanism, ensuring end-to-end privacy throughout the model's lifecycle under the semi-honest adversary model. The framework supports fully private training and inference over distributed, heterogeneous datasets. Experiments show that the method achieves log-likelihood performance on par with non-private SPN baselines while significantly outperforming existing privacy-preserving neural network approaches in runtime efficiency and memory scalability. To the best of the authors' knowledge, this is the first work to achieve high accuracy, high efficiency, and strong privacy guarantees simultaneously in multi-party SPN modeling.

📝 Abstract
A sum-product network (SPN) is a graphical model that allows several types of probabilistic inference to be performed efficiently. In this paper, we propose a privacy-preserving protocol which tackles structure generation and parameter learning of SPNs. Additionally, we provide a protocol for private inference on SPNs subsequent to training. To preserve the privacy of the participants, we derive our protocol based on secret sharing, which guarantees privacy in the honest-but-curious setting even when up to half of the parties cooperate to disclose the data. The protocol makes use of a forest of randomly generated SPNs, which is trained and weighted privately and can then be used for private inference on data points. Our experiments indicate that preserving the privacy of all participants does not decrease log-likelihood performance on either homogeneously or heterogeneously partitioned data. We furthermore show that our protocol's performance is comparable to current state-of-the-art SPN learners in homogeneously partitioned data settings. In terms of runtime and memory usage, we demonstrate that our implementation scales well as the number of parties increases, comparing favorably to protocols for neural networks trained to reproduce the input-output behavior of SPNs.
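The abstract's threshold guarantee (privacy even when up to half of the parties collude) is characteristic of Shamir-style threshold secret sharing. The sketch below illustrates that building block only, not the paper's actual protocol; the field modulus `P` and the function names `share`/`reconstruct` are illustrative assumptions.

```python
import random

P = 2**61 - 1  # a Mersenne prime, used here as an illustrative field modulus

def share(secret, t, n):
    """Split `secret` into n Shamir shares: any t+1 shares reconstruct it,
    while any t or fewer shares reveal nothing about it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]

    def f(x):  # evaluate the random degree-t polynomial via Horner's rule
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % P
        return acc

    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange-interpolate the shared polynomial at x=0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse of den (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

One property that makes this useful for private learning: shares are additively homomorphic, so parties can add their shares pointwise to obtain shares of a sum of local statistics without any party seeing another's data.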
Problem

Research questions and friction points this paper is trying to address.

Private structure and parameter learning for sum-product networks
Privacy-preserving protocol using secret sharing techniques
Scalable multi-party computation without performance degradation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Privacy-preserving protocol for SPN structure and parameter learning
Secret sharing ensures privacy in the honest-but-curious setting
Forest of randomly generated SPNs enables private inference
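The bullets above describe a forest of randomly generated SPNs whose outputs are combined by learned weights. The following plaintext sketch shows what such a forest evaluates, assuming Bernoulli leaves; in the paper's protocol these operations would run over secret shares, and all class and function names here are illustrative.

```python
# A minimal SPN: leaves model one binary variable, internal nodes are
# products of children or weighted sums of children.
class Leaf:
    def __init__(self, var, p):
        self.var, self.p = var, p  # Bernoulli leaf: P(var = 1) = p

    def value(self, x):
        return self.p if x[self.var] else 1.0 - self.p

class Product:
    def __init__(self, children):
        self.children = children

    def value(self, x):
        v = 1.0
        for c in self.children:  # product node multiplies child likelihoods
            v *= c.value(x)
        return v

class Sum:
    def __init__(self, children, weights):
        self.children, self.weights = children, weights

    def value(self, x):  # sum node is a mixture over its children
        return sum(w * c.value(x) for w, c in zip(self.weights, self.children))

def forest_likelihood(spns, weights, x):
    """Weighted mixture over a forest of SPNs: the forest's estimate of P(x)."""
    return sum(w * s.value(x) for w, s in zip(weights, spns))
```

Structure generation then amounts to sampling random sum/product trees over partitions of the variables, and parameter learning to privately fitting the leaf parameters, sum weights, and per-tree forest weights.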