AI Summary
This work investigates the recovery of $s$ planted rank-one matrices lying in a linear subspace, aiming to precisely characterize the success/failure threshold of a polynomial-time algorithm. The analysis combines spectral methods with random matrix theory, under algebraic-geometric genericity assumptions and asymptotic probabilistic arguments. Theoretically, it establishes that the algorithm succeeds whenever the subspace dimension satisfies $R \leq (1 - o(1))mn/2$, and provably fails when $R \geq (1 + o(1))mn/\sqrt{2}$, improving the prior best-known sufficient condition from $mn/4$ to $mn/2$ and localizing the phase transition to the interval $[mn/2,\, mn/\sqrt{2}]$. Numerical experiments indicate that the true breaking point is $R = (1 + o(1))mn/\sqrt{2}$. This improvement effectively doubles the reach of the corresponding tensor-decomposition algorithms: for fourth-order tensors, twice as many rank-one components can be recovered as before.
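To see where the doubling comes from, a standard flattening reduction (assumed here for illustration; the paper's exact reduction may differ in detail) maps an order-4 tensor with $s$ generic rank-one components

$$T = \sum_{i=1}^{s} a_i \otimes b_i \otimes c_i \otimes d_i \in (\mathbb{R}^n)^{\otimes 4}$$

to the $n^2 \times n^2$ matrix $\mathrm{mat}(T) = \sum_{i=1}^{s} \mathrm{vec}(a_i b_i^{\top})\,\mathrm{vec}(c_i d_i^{\top})^{\top}$, whose column space, read back as $n \times n$ matrices, is a subspace $\mathcal{U} = \mathrm{span}\{a_i b_i^{\top}\}$ of dimension $R = s$ containing $s$ planted rank-one matrices. With $m = n$, raising the admissible dimension from $(1-o(1))\,n^2/4$ to $(1-o(1))\,n^2/2$ therefore doubles the number of recoverable components.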
Abstract
We consider a basic computational task of finding $s$ planted rank-1 $m \times n$ matrices in a linear subspace $\mathcal{U} \subseteq \mathbb{R}^{m \times n}$ where $\dim(\mathcal{U}) = R \ge s$. The work of Johnston-Lovitz-Vijayaraghavan (FOCS 2023) gave a polynomial-time algorithm for this task and proved that it succeeds when ${R \le (1-o(1))mn/4}$, under minimal genericity assumptions on the input. Aiming to precisely characterize the performance of this algorithm, we improve the bound to ${R \le (1-o(1))mn/2}$ and also prove that the algorithm fails when ${R \ge (1+o(1))mn/\sqrt{2}}$. Numerical experiments indicate that the true breaking point is $R = (1+o(1))mn/\sqrt{2}$. Our work implies new algorithmic results for tensor decomposition, for instance, decomposing order-4 tensors with twice as many components as before.
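As a concrete illustration of the setup, here is a minimal sketch (not the authors' code; the function name `planted_instance` and the Gaussian model of genericity are hypothetical stand-ins) of how one might generate an instance for numerical experiments: a dimension-$R$ subspace of $\mathbb{R}^{m \times n}$ that hides $s$ generic rank-one matrices, of which the algorithm is handed only an arbitrary mixed basis.

```python
# Sketch of a planted-instance generator for the problem in the abstract.
# Genericity is modeled by i.i.d. Gaussian factors; the recovery algorithm
# of Johnston-Lovitz-Vijayaraghavan (FOCS 2023) is NOT reproduced here.
import numpy as np

def planted_instance(m, n, s, R, seed=None):
    """Return (ground-truth rank-one matrices, an arbitrary basis of U)
    for a dimension-R subspace U of m-by-n matrices containing s planted
    generic rank-one matrices."""
    assert s <= R <= m * n
    rng = np.random.default_rng(seed)
    # Planted rank-one matrices u_i v_i^T with generic (Gaussian) factors.
    planted = [np.outer(rng.standard_normal(m), rng.standard_normal(n))
               for _ in range(s)]
    # Pad with R - s generic matrices to reach subspace dimension R.
    fillers = [rng.standard_normal((m, n)) for _ in range(R - s)]
    basis = np.stack(planted + fillers)            # shape (R, m, n)
    # Mix by a random (almost surely invertible) R x R matrix so the
    # planted matrices are hidden: the algorithm sees only some basis of U.
    coeffs = rng.standard_normal((R, R))
    mixed = np.tensordot(coeffs, basis, axes=([1], [0]))
    return planted, mixed

if __name__ == "__main__":
    m, n, s = 20, 20, 10
    R = int(0.4 * m * n)        # below the proven success threshold mn/2
    truth, mixed_basis = planted_instance(m, n, s, R, seed=0)
    print(mixed_basis.shape)    # (R, m, n)
```

A full experiment would then run the recovery algorithm on the mixed basis and record success rates as $R$ crosses $mn/2$ and $mn/\sqrt{2}$; that algorithm is beyond the scope of this sketch.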