AI Summary
This paper studies the smoothed complexity of the simplex method under Gaussian perturbations for linear programs with $d$ variables and $n$ constraints. Building on a new shadow-vertex analysis framework, the authors combine geometric probability with smoothed-analysis techniques. Their main contribution is an improved upper bound on the expected running time, $O(\sigma^{-3/2} d^{13/4} \log^{7/4} n)$, which weakens the dependence on the noise magnitude $\sigma$; the previous best bound scaled as $\sigma^{-2}$. Notably, in the two-dimensional case ($d = 2$) they derive an almost-tight smoothed complexity bound, demonstrating both the generality of their approach and its potential for near-optimality. This result provides the strongest theoretical guarantee to date for the efficiency of the simplex method under Gaussian smoothing.
Abstract
The simplex method for linear programming is known to be highly efficient in practice, and understanding its performance from a theoretical perspective is an active research topic. The framework of smoothed analysis, first introduced by Spielman and Teng (JACM '04) for this purpose, defines the smoothed complexity of solving a linear program with $d$ variables and $n$ constraints as the expected running time when Gaussian noise of variance $\sigma^2$ is added to the LP data. We prove that the smoothed complexity of the simplex method is $O(\sigma^{-3/2} d^{13/4} \log^{7/4} n)$, improving the dependence on $1/\sigma$ compared to the previous bound of $O(\sigma^{-2} d^{2} \sqrt{\log n})$. We accomplish this through a new analysis of the shadow bound, which was also key to earlier analyses. Illustrating the power of the new method, we use it to prove a nearly tight upper bound on the smoothed complexity of two-dimensional polygons.
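To make the improvement in the $1/\sigma$ dependence concrete, the following sketch compares the growth of the two asymptotic bounds as the noise magnitude $\sigma$ shrinks. This is purely illustrative: hidden constants are ignored, and the specific values of $d$ and $n$ are arbitrary placeholders, not from the paper.

```python
import math

def new_bound(sigma, d, n):
    # O(sigma^{-3/2} d^{13/4} log^{7/4} n), constants ignored
    return sigma ** -1.5 * d ** 3.25 * math.log(n) ** 1.75

def old_bound(sigma, d, n):
    # Previous bound O(sigma^{-2} d^2 sqrt(log n)), constants ignored
    return sigma ** -2.0 * d ** 2 * math.sqrt(math.log(n))

# As sigma -> 0 with d and n fixed, the new bound grows like sigma^{-3/2}
# versus sigma^{-2}, so the ratio new/old shrinks like sigma^{1/2}.
for sigma in (1e-2, 1e-4, 1e-6):
    ratio = new_bound(sigma, 10, 1000) / old_bound(sigma, 10, 1000)
    print(f"sigma={sigma:.0e}  new/old ratio={ratio:.3g}")
```

For fixed $d$ and $n$, each hundredfold decrease in $\sigma$ shrinks the ratio of the new bound to the old one by a factor of ten, reflecting the $\sigma^{-3/2}$ versus $\sigma^{-2}$ scaling; for large $d$, the larger $d^{13/4}$ factor can offset this advantage.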