Private List Learnability vs. Online List Learnability

📅 2025-06-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the relationship between differential privacy (DP) and online learning within the PAC $k$-list learning framework, where a learner outputs a list of $k$ candidate labels and errs when the true label is not on the list. While finite Littlestone dimension characterizes DP learnability in standard multiclass PAC learning, the authors show that finite $k$-Littlestone dimension is necessary but not sufficient for DP $k$-list learnability. They construct a class of monotone functions that is online $k$-list learnable yet not DP $k$-list learnable, demonstrating a separation between the two notions. They further introduce the $k$-monotone dimension, a new combinatorial complexity measure generalizing the threshold dimension, and prove that its finiteness is a second necessary condition for DP $k$-list learnability; unlike in the multiclass setting, for $k>1$ the $k$-monotone and $k$-Littlestone dimensions need not be finite together. Whether finiteness of both dimensions suffices for private $k$-list learnability is left open.

📝 Abstract
This work explores the connection between differential privacy (DP) and online learning in the context of PAC list learning. In this setting, a $k$-list learner outputs a list of $k$ potential predictions for an instance $x$ and incurs a loss if the true label of $x$ is not included in the list. A basic result in the multiclass PAC framework with a finite number of labels states that private learnability is equivalent to online learnability [Alon, Livni, Malliaris, and Moran (2019); Bun, Livni, and Moran (2020); Jung, Kim, and Tewari (2020)]. Perhaps surprisingly, we show that this equivalence does not hold in the context of list learning. Specifically, we prove that, unlike in the multiclass setting, a finite $k$-Littlestone dimension--a variant of the classical Littlestone dimension that characterizes online $k$-list learnability--is not a sufficient condition for DP $k$-list learnability. However, similar to the multiclass case, we prove that it remains a necessary condition. To demonstrate where the equivalence breaks down, we provide an example showing that the class of monotone functions with $k+1$ labels over $\mathbb{N}$ is online $k$-list learnable, but not DP $k$-list learnable. This leads us to introduce a new combinatorial dimension, the \emph{$k$-monotone dimension}, which serves as a generalization of the threshold dimension. Unlike the multiclass setting, where the Littlestone and threshold dimensions are finite together, for $k>1$, the $k$-Littlestone and $k$-monotone dimensions do not exhibit this relationship. We prove that a finite $k$-monotone dimension is another necessary condition for DP $k$-list learnability, alongside finite $k$-Littlestone dimension. Whether the finiteness of both dimensions implies private $k$-list learnability remains an open question.
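The abstract's loss model can be sketched in a few lines: a $k$-list learner outputs a size-$k$ list and pays 0-1 loss exactly when the true label is missing. The function name `k_list_loss` is a hypothetical label for illustration, not notation from the paper.

```python
def k_list_loss(predicted_list, true_label):
    """0-1 list loss: the learner incurs loss 1 iff the true
    label of the instance is absent from its k-element list."""
    return 0 if true_label in predicted_list else 1

# A 2-list learner hedging between labels 0 and 1 on some instance:
print(k_list_loss([0, 1], 1))  # true label is on the list -> loss 0
print(k_list_loss([0, 1], 2))  # true label is missed      -> loss 1
```

For $k=1$ this reduces to the standard multiclass 0-1 loss, which is why the list setting strictly generalizes the regime where private and online learnability coincide.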
Problem

Research questions and friction points this paper is trying to address.

Explores the connection between DP and online learnability in the $k$-list setting
Shows that finite $k$-Littlestone dimension is necessary but not sufficient for DP $k$-list learnability
Asks which combinatorial dimensions govern private list learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Separates DP $k$-list learnability from online $k$-list learnability via a monotone function class
Introduces the $k$-monotone dimension, whose finiteness is necessary for DP $k$-list learnability
Proves finite $k$-Littlestone dimension is necessary but insufficient for DP $k$-list learnability