An Optimal Sorting Algorithm for Persistent Random Comparison Faults

📅 2025-08-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies sorting under persistent stochastic comparison errors: each comparison errs with a fixed small probability $p < 1/4$, and repeated comparisons of the same pair always yield the identical (hence uncorrectable) outcome, making perfect sorting impossible. The objective is to minimize dislocation, i.e., the absolute difference between an element's output position and its true rank. The authors present the first $O(n \log n)$-time algorithm achieving $O(\log n)$ maximum dislocation and $O(n)$ total dislocation. They prove these dislocation bounds are optimal, resolving a long-standing complexity question in this model. The algorithm combines robust insertion into approximately sorted sequences with a dislocation-correction procedure, integrating recursive construction with refined probabilistic analysis. It demonstrates that, under moderate noise levels, comparison errors do not increase the asymptotic time complexity of sorting.
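To make the error model above concrete, here is a minimal Python sketch (not from the paper) of a persistent noisy comparator: each pair's answer is drawn once and then cached, so repeating a comparison never yields new information.

```python
import random

def make_persistent_comparator(p, seed=0):
    """Return a 'less-than' oracle that errs with probability p, but
    always gives the same answer for the same pair (persistent errors)."""
    rng = random.Random(seed)
    memo = {}
    def less(a, b):
        key = (min(a, b), max(a, b))
        if key not in memo:
            # decide once, per pair, whether this comparison is flipped
            memo[key] = rng.random() < p
        truth = a < b
        return (not truth) if memo[key] else truth
    return less
```

Because the flip decision is memoized per pair, majority voting over repeated queries (a standard trick for non-persistent noise) is useless here, which is what makes this model hard.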

📝 Abstract
We consider the problem of sorting $n$ elements subject to persistent random comparison errors. In this problem, each comparison between two elements can be wrong with some fixed (small) probability $p$, and comparing the same pair of elements multiple times always yields the same result. Sorting perfectly in this model is impossible, and the objective is to minimize the dislocation of each element in the output sequence, i.e., the difference between its position in the sequence and its true rank. In this paper, we present the first $O(n\log n)$-time sorting algorithm that guarantees both $O(\log n)$ maximum dislocation and $O(n)$ total dislocation with high probability when $p<\frac{1}{4}$. This settles the time complexity of sorting with persistent comparison errors in the given range of $p$ and shows that comparison errors do not increase its computational difficulty. Indeed, $\Omega(n\log n)$ time is necessary to achieve a maximum dislocation of $O(\log n)$ even without comparison errors. Moreover, we prove that no algorithm can guarantee a maximum dislocation of $o(\log n)$ with high probability, nor a total dislocation of $o(n)$ in expectation. To develop our sorting algorithm, we solve two related sub-problems, which might be of independent interest. More precisely, we show that $O(\log n)$ time suffices to find a position in which to insert a new element $x$ into an almost-sorted sequence $S$ of $n$ elements having dislocation at most $d=\Omega(\log n)$, so that the dislocation of $x$ in the resulting sequence is $O(d)$ with high probability (which can equivalently be thought of as the problem of estimating the rank of $x$ in $S$). We also show that the maximum (resp. total) dislocation of an approximately sorted sequence $S$ of $n$ elements can be lowered to $O(\log n)$ (resp. $O(n)$) in $O(nd)$ time, w.h.p., where $d$ is an upper bound on the maximum dislocation of $S$.
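The insertion sub-problem from the abstract can be illustrated with a simplified two-phase sketch (this is not the paper's actual algorithm, and the names `estimate_rank` and `less` are hypothetical): an ordinary, error-prone binary search gives a coarse position, and then counting how many elements compare below $x$ within a window of width proportional to the dislocation bound $d$ refines it.

```python
def estimate_rank(S, x, less, d):
    """Estimate the rank of x in an almost-sorted sequence S whose
    maximum dislocation is at most d, using a noisy comparator `less`.
    Simplified illustration: coarse binary search, then a local count."""
    lo, hi = 0, len(S)
    while lo < hi:                      # coarse (error-prone) binary search
        mid = (lo + hi) // 2
        if less(S[mid], x):
            lo = mid + 1
        else:
            hi = mid
    # refine: count elements below x inside a window of width O(d)
    w_lo = max(0, lo - 3 * d)
    w_hi = min(len(S), lo + 3 * d)
    below = sum(1 for i in range(w_lo, w_hi) if less(S[i], x))
    return w_lo + below
```

The intuition is that, since each element of $S$ sits within $d$ of its true position, only a window of width $O(d)$ around the coarse position is in doubt, and aggregating many independent noisy comparisons inside that window concentrates the estimate.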
Problem

Research questions and friction points this paper is trying to address.

Sorting elements with persistent random comparison errors
Minimizing the maximum and total dislocation of the output sequence
Achieving optimal O(n log n) time complexity despite errors
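The dislocation objective listed above is simple to evaluate once the true ranks are known; the helper below (illustrative, with a hypothetical `true_rank` mapping from element to rank) returns both the maximum and the total dislocation of an output sequence.

```python
def dislocations(output, true_rank):
    """Max and total dislocation of `output`: for each element, the
    absolute difference between its position and its true rank."""
    disp = [abs(pos - true_rank[x]) for pos, x in enumerate(output)]
    return max(disp), sum(disp)
```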
Innovation

Methods, ideas, or system contributions that make the work stand out.

O(n log n) time algorithm for persistent errors
Guarantees O(log n) maximum dislocation
Achieves O(n) total dislocation with high probability