🤖 AI Summary
This study addresses symmetric private information retrieval (SPIR) in graph-based replicated databases, where each message is stored on exactly the two servers at the endpoints of the corresponding graph edge. For the first time, it analyzes how two distributions of server-side common randomness, graph-replicated and fully replicated, affect the SPIR capacity. Leveraging information-theoretic and graph-theoretic techniques, the work establishes a general lower bound on the capacity and proves it tight for path graphs and regular graphs. It further demonstrates that fully replicated common randomness can strictly improve the achievable capacity, shown via SPIR schemes constructed from PIR schemes. The paper also establishes that the minimum amount of common randomness required to achieve capacity equals the size of a single message.
📝 Abstract
In symmetric private information retrieval (SPIR), a user communicates with multiple servers to retrieve a message from a database while revealing the desired message index to no individual server (user privacy) and learning no information about the database beyond that message (database privacy). We study SPIR on graph-replicated database systems, where each node of the graph represents a server and each link represents a message. Each message is replicated at exactly two servers, namely, the two endpoints of the link representing that message. To ensure database privacy, the servers share a set of common randomness, independent of the database and of the user's desired message index. We study two cases of common randomness distribution to the servers: i) graph-replicated common randomness, and ii) fully-replicated common randomness. Given a graph-replicated database system, in i), we assign one randomness variable independently to every pair of servers sharing a message, while in ii), we assign an identical set of randomness variables to all servers, irrespective of the underlying graph. In both settings, our goal is to characterize the SPIR capacity, i.e., the maximum number of desired message symbols retrieved per downloaded symbol, and to quantify the minimum amount of common randomness required to achieve it. In setting i), we derive a general lower bound on the SPIR capacity and show that it is tight for path and regular graphs through a matching converse. Moreover, we establish that the minimum size of common randomness required for SPIR equals the message size. In setting ii), the SPIR capacity improves over the first, more restrictive setting; we show this through capacity lower bounds for a class of graphs, obtained by constructing SPIR schemes from PIR schemes.
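To make the role of common randomness concrete, here is a minimal sketch of the classic two-server XOR-based SPIR idea for a single pair of servers holding the same bit-string database. This is an illustrative textbook construction, not the paper's graph-based scheme: each server answers with the parity of a randomly chosen subset of bits, masked by a shared randomness bit unknown to the user, so the user recovers exactly the desired bit and nothing more.

```python
import secrets

def answer(x, S, r):
    """A server's answer in the toy scheme: the XOR (parity) of the
    queried bits, masked by the shared common randomness bit r."""
    a = r
    for j in S:
        a ^= x[j]
    return a

def retrieve(x, i):
    """User retrieves bit x[i] from two servers that both hold x."""
    K = len(x)
    # User privacy: each server individually sees a uniformly random
    # subset of indices, which reveals nothing about i.
    S1 = {j for j in range(K) if secrets.randbits(1)}
    S2 = S1 ^ {i}  # symmetric difference: toggle membership of index i
    # Database privacy: both servers share one common randomness bit r,
    # unknown to the user; it makes each individual answer uniform.
    r = secrets.randbits(1)
    a1 = answer(x, S1, r)
    a2 = answer(x, S2, r)
    # The masks cancel, and only index i survives the symmetric
    # difference, so the user learns x[i] and nothing else.
    return a1 ^ a2

database = [1, 0, 1, 1, 0, 0, 1, 0]
print(retrieve(database, 2))  # → 1 (one message bit per two downloaded bits)
```

In this toy scheme the user downloads two bits to learn one, i.e., a rate of 1/2; the paper's capacity results characterize the best such rate on graph-replicated systems, where each message is held by only two specific servers rather than all of them.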