Towards True Work-Efficiency in Parallel Derandomization: MIS, Maximal Matching, and Hitting Set

📅 2025-04-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Deterministic parallel derandomization has long suffered from inefficient work complexity, with existing deterministic algorithms significantly lagging behind their sequential counterparts. Method: an iterative rounding framework that converts fractional solutions, which represent the randomized assignments, into integral solutions that provide deterministic assignments, while maintaining certain linear or quadratic objective functions, all with low-depth parallel primitives in the PRAM model. Contribution/Results: the work complexity of deterministic parallel algorithms for fundamental combinatorial problems, including maximal independent set, maximal matching, and hitting set, is reduced from $m \cdot \mathrm{poly}(\log n)$ to $m \cdot \mathrm{poly}(\log \log n)$, while maintaining $\mathrm{poly}(\log n)$ depth. This moves parallel derandomization exponentially closer to work-efficiency, so that the deterministic algorithms beat the trivial sequential baselines at far smaller processor counts than prior techniques required.

📝 Abstract
Derandomization is one of the classic topics studied in the theory of parallel computations, dating back to the early 1980s. Despite much work, all known techniques lead to deterministic algorithms that are not work-efficient. For instance, for the well-studied problem of maximal independent set -- e.g., [Karp, Wigderson STOC'84; Luby STOC'85; Luby FOCS'88] -- state-of-the-art deterministic algorithms require at least $m \cdot \mathrm{poly}(\log n)$ work, where $m$ and $n$ denote the number of edges and vertices. Hence, these deterministic algorithms will remain slower than their trivial sequential counterparts unless we have at least $\mathrm{poly}(\log n)$ processors. In this paper, we present a generic parallel derandomization technique that moves exponentially closer to work-efficiency. The method iteratively rounds fractional solutions representing the randomized assignments to integral solutions that provide deterministic assignments, while maintaining certain linear or quadratic objective functions, and in an \textit{essentially work-efficient} manner. As example end-results, we use this technique to obtain deterministic algorithms with $m \cdot \mathrm{poly}(\log \log n)$ work and $\mathrm{poly}(\log n)$ depth for problems such as maximal independent set, maximal matching, and hitting set.
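To make the core rounding idea concrete, here is a minimal sequential sketch (not the paper's parallel algorithm) of deterministically rounding a fractional vector in $[0,1]^n$ to an integral one while exactly preserving a linear constraint and never decreasing a linear objective. It repeatedly pairs two fractional coordinates and shifts mass between them, choosing the direction via the objective weights; the function name and interface are illustrative assumptions.

```python
def round_preserving_sum(x, weights):
    """Round fractional x in [0,1]^n to {0,1}^n (up to one leftover
    coordinate, rounded greedily at the end), keeping sum(x) fixed
    during pairing and never decreasing sum(weights[i] * x[i])."""
    x = list(x)
    frac = [i for i, v in enumerate(x) if 0 < v < 1]
    while len(frac) >= 2:
        i, j = frac[-1], frac[-2]
        # Shifting +d onto i and -d off j changes the objective by
        # d * (weights[i] - weights[j]); pick the nonnegative direction.
        if weights[i] >= weights[j]:
            d = min(1 - x[i], x[j])
            x[i] += d
            x[j] -= d
        else:
            d = min(x[i], 1 - x[j])
            x[i] -= d
            x[j] += d
        frac = [k for k in frac if 0 < x[k] < 1]
    # At most one coordinate can remain fractional; round it greedily.
    for k, v in enumerate(x):
        if 0 < v < 1:
            x[k] = 1.0 if weights[k] >= 0 else 0.0
    return x
```

Each pairing step makes at least one coordinate integral, so the loop runs at most $n$ times; the paper's contribution lies in carrying out such rounding in parallel with low depth and near-linear total work, which this sketch does not attempt.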
Problem

Research questions and friction points this paper is trying to address.

Achieving work-efficient parallel derandomization for classic problems
Reducing the work overhead from $\mathrm{poly}(\log n)$ to $\mathrm{poly}(\log \log n)$ factors over the input size
Developing deterministic algorithms for MIS, maximal matching, and hitting set
Innovation

Methods, ideas, or system contributions that make the work stand out.

Iterative rounding of fractional solutions
Maintaining linear or quadratic objectives
Essentially work-efficient parallel derandomization
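For context, the randomized baseline that such derandomizations target is Luby-style MIS: in each round every surviving vertex draws a random priority and joins the independent set if it beats all neighbors. A minimal sequential simulation (the real algorithm runs each round in low depth on a PRAM; this sketch and its interface are illustrative assumptions):

```python
import random

def luby_mis(adj):
    """Sequential simulation of Luby's randomized MIS.
    adj: dict mapping each vertex to the set of its neighbors."""
    adj = {u: set(vs) for u, vs in adj.items()}
    mis = set()
    while adj:
        # Each surviving vertex draws a random priority.
        r = {u: random.random() for u in adj}
        # A vertex joins the MIS if its priority beats all neighbors'.
        winners = {u for u in adj if all(r[u] < r[v] for v in adj[u])}
        mis |= winners
        # Remove winners and their neighbors from the graph.
        dead = set(winners)
        for u in winners:
            dead |= adj[u]
        adj = {u: {v for v in vs if v not in dead}
               for u, vs in adj.items() if u not in dead}
    return mis
```

The output is always independent and maximal regardless of the random draws; the randomness only controls the number of rounds, which is $O(\log n)$ in expectation. The paper's contribution is removing that randomness while keeping the work near-linear.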