🤖 AI Summary
This work addresses the problem of efficiently maintaining a maximal independent set (MIS) in a graph undergoing batched edge insertions and deletions in parallel. We present the first theoretically efficient parallel dynamic MIS algorithm, which maintains the lexicographically first MIS under a random vertex ordering. To simplify the theoretical analysis, we introduce a batch influence set framework that cleanly captures the propagation of changes. By combining parallel sub-round scheduling with a refined work analysis, our algorithm achieves an expected work bound of $O(b \log^3 n)$ and polylogarithmic depth for update batches of size $b$. Notably, even in the special case of single-edge updates, our approach improves upon the best-known sequential dynamic MIS algorithms.
📝 Abstract
We develop the first theoretically efficient algorithm for maintaining the maximal independent set (MIS) of a graph in the parallel batch-dynamic setting. In this setting, a graph is updated with batches of edge insertions/deletions, and for each batch a parallel algorithm updates the maximal independent set to agree with the new graph. A batch-dynamic algorithm is considered efficient if it is work efficient (i.e., does no more asymptotic work than applying the updates sequentially) and has polylogarithmic depth (parallel time). In the sequential setting, the best known dynamic algorithms for MIS, by Chechik and Zhang (CZ) [FOCS19] and Behnezhad et al. (BDHSS) [FOCS19], take $O(\log^4 n)$ time per update in expectation. For a batch of $b$ updates, our algorithm has $O(b \log^3 n)$ expected work and polylogarithmic depth with high probability (whp). It therefore outperforms the best known algorithms even in the sequential dynamic case ($b = 1$).
As with the sequential dynamic MIS algorithms of CZ and BDHSS, our solution maintains a lexicographically first MIS based on a random ordering of the vertices. Their analysis relied on a result of Censor-Hillel, Haramaty, and Karnin [PODC16] that bounded the ``influence set'' for a single update, but surprisingly, the influence set of a batch is not simply the union of the influence sets of the individual updates it contains. We therefore develop a new approach to analyzing the influence set for a batch of updates. Our construction of the batch influence set is natural and leads to an arguably simpler analysis than prior work. We then instrument this construction to bound the work of our algorithm. To argue that our depth is polylogarithmic, we prove that the number of subrounds our algorithm takes matches the round bounds of parallel static MIS algorithms.
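The invariant both this and the prior sequential algorithms maintain, the lexicographically first MIS under a random vertex ordering, is easiest to see in its static form: scan the vertices in a random order and greedily add each one whose neighbors are all absent. A minimal sketch (the adjacency-list representation and function name are illustrative, not from the paper):

```python
import random

def lex_first_mis(adj, order):
    """Greedy lexicographically-first MIS: scan vertices in the given
    order and add each vertex none of whose neighbors is already in
    the set. The result is uniquely determined by the ordering."""
    in_mis = set()
    for v in order:
        if all(u not in in_mis for u in adj[v]):
            in_mis.add(v)
    return in_mis

# Example: a path 0-1-2-3 under a random vertex ordering.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
order = list(adj)
random.shuffle(order)
mis = lex_first_mis(adj, order)
```

The dynamic algorithms keep the output consistent with this greedy scan as edges change; the random ordering is what makes the expected size of the affected region, the influence set, small.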