🤖 AI Summary
This work investigates the average-case computational complexity relationship between noisy $k$-XOR and Tensor Principal Component Analysis (Tensor PCA). By constructing a family of problems interpolating between the two and introducing two density-amplifying ("densifying") reduction techniques, the authors give the first polynomial-time average-case reduction from any $k$-XOR instance at the computational threshold to Tensor PCA at the computational threshold, increasing the number of observed entries while controlling the decay of the signal. The framework further enables order reduction at fixed entry density, such as reducing 5-XOR to 4-XOR or order-7 Tensor PCA to order-4, thereby linking the hardness of the two problems at their respective thresholds. These results uncover an intrinsic hardness connection and significantly extend the reducibility frontier for high-order tensor problems.
📝 Abstract
We study two canonical planted average-case problems -- noisy $k\mathsf{\text{-}XOR}$ and Tensor PCA -- and relate their computational properties via poly-time average-case reductions. In fact, we consider a \emph{family of problems} that interpolates between $k\mathsf{\text{-}XOR}$ and Tensor PCA, allowing intermediate densities and signal levels. We introduce two \emph{densifying} reductions that increase the number of observed entries while controlling the decrease in signal, and, in particular, reduce any $k\mathsf{\text{-}XOR}$ instance at the computational threshold to Tensor PCA at the computational threshold. Additionally, we give new order-reducing maps (e.g., $5\to 4$ $k\mathsf{\text{-}XOR}$ and $7\to 4$ Tensor PCA) at fixed entry density.
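To make the two planted models concrete, here is a minimal sketch of their standard formulations: noisy $k$-XOR observes noisy parities of a hidden sign vector, and Tensor PCA observes a rank-one spike $\lambda\, v^{\otimes k}$ buried in Gaussian noise. The normalizations (noise rate parameterization, signal scaling) are common conventions and may differ from the paper's exact setup.

```python
import numpy as np

def planted_kxor(n, k, m, eta, rng=None):
    """Sample m noisy k-XOR clauses: each label is the parity of k random
    coordinates of a hidden sign vector x, flipped with probability (1-eta)/2."""
    rng = np.random.default_rng(rng)
    x = rng.choice([-1, 1], size=n)                  # hidden assignment
    idx = rng.integers(0, n, size=(m, k))            # clause index tuples
    parities = np.prod(x[idx], axis=1)               # true parities in {-1, +1}
    flips = rng.choice([-1, 1], size=m, p=[(1 - eta) / 2, (1 + eta) / 2])
    return idx, parities * flips, x

def planted_tensor_pca(n, k, lam, rng=None):
    """Sample an order-k Tensor PCA instance T = lam * v^{(tensor k)} + G,
    where v is a planted unit vector and G has iid N(0,1) entries."""
    rng = np.random.default_rng(rng)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)                           # planted unit spike
    spike = v
    for _ in range(k - 1):
        spike = np.tensordot(spike, v, axes=0)       # build v tensored k times
    return lam * spike + rng.standard_normal((n,) * k), v
```

The interpolating family studied in the paper sits between these extremes: $k$-XOR reveals few, heavily quantized entries of the signal tensor, while Tensor PCA reveals every entry with Gaussian noise; the densifying reductions move instances from the sparse end toward the dense end.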