🤖 AI Summary
This paper investigates the computational complexity of the Shortest Vector Problem in the ℓₚ norm (SVPₚ) for finite p. For all p > 2, it establishes, under standard deterministic Karp reductions, the first unconditional NP-hardness of exact SVPₚ, resolving a long-standing open problem and improving on prior results that relied on randomized reductions (i.e., on the assumption NP ⊄ RP). Moreover, it proves that SVPₚ is NP-hard to approximate within a factor of 2^{log^{1−ε} n} for any constant ε > 0; in particular, no polynomial-time algorithm achieves this approximation ratio unless P = NP. Methodologically, the paper gives a direct, elementary, and constructive reduction from a regularized PCP instance to SVPₚ that relies only on simple gadgets built from Vandermonde and Hadamard matrices, bypassing the intricate coding-theoretic machinery of earlier proofs; the reduction is both conceptually simple and technically robust. Furthermore, under the Sliding Scale Conjecture, the hardness extends to polynomial approximation factors. The result thus unifies and strengthens the inapproximability landscape for SVPₚ across ℓₚ norms with p > 2.
📝 Abstract
We prove that SVP$_p$ is NP-hard to approximate within a factor of $2^{\log^{1 - \varepsilon} n}$, for all constants $\varepsilon > 0$ and $p > 2$, under standard deterministic Karp reductions. This result is also the first proof that \emph{exact} SVP$_p$ is NP-hard in a finite $\ell_p$ norm. Hardness for SVP$_p$ with $p$ finite was previously only known if NP $\not\subseteq$ RP, and under that assumption, hardness of approximation was only known for all constant factors. As a corollary to our main theorem, we show that under the Sliding Scale Conjecture, SVP$_p$ is NP-hard to approximate within a small polynomial factor, for all constants $p > 2$. Our proof techniques are surprisingly elementary; we reduce from a \emph{regularized} PCP instance directly to the shortest vector problem by using simple gadgets related to Vandermonde matrices and Hadamard matrices.
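For readers unfamiliar with the two gadget families mentioned above: Vandermonde and Hadamard matrices are standard, explicitly constructible objects, which is part of why the reduction is elementary. The sketch below only illustrates their defining properties (a Vandermonde matrix on distinct points is invertible; a Hadamard matrix has pairwise-orthogonal ±1 rows), not the paper's specific reduction; it assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.linalg import hadamard

# Vandermonde matrix on distinct points x_1..x_n: V[i, j] = x_i^(n-1-j).
# Distinct evaluation points make the Vandermonde determinant nonzero,
# so V is invertible.
x = np.arange(1, 5)
V = np.vander(x)  # 4x4 integer matrix; row i is a geometric progression in x_i

# Hadamard matrix of order 4 (Sylvester construction): entries are ±1 and
# the rows are pairwise orthogonal, so H @ H.T = n * I.
H = hadamard(4)

assert np.linalg.det(V) != 0                      # invertibility
assert np.array_equal(H @ H.T, 4 * np.eye(4))     # orthogonal ±1 rows
```

In the reduction itself these matrices serve as gadgets with controlled norm behavior; the code above is only meant to make the underlying objects concrete.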