Improving the Threshold for Finding Rank-1 Matrices in a Subspace

📅 2025-04-24
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work investigates the recovery of $s$ rank-one matrices planted in a linear subspace, aiming to precisely characterize the success/failure threshold of a polynomial-time algorithm. The analysis combines spectral methods with random matrix theory, under algebraic-geometric genericity assumptions and asymptotic probabilistic arguments. Theoretically, the paper establishes that exact recovery is achievable when the subspace dimension satisfies $R \leq (1 - o(1))mn/2$, while the algorithm fails when $R \geq (1 + o(1))mn/\sqrt{2}$, improving the prior best-known sufficient condition from $mn/4$ to $mn/2$ and pinning the algorithm's phase transition to the interval $[mn/2,\, mn/\sqrt{2}]$. Numerical experiments corroborate a sharp transition at the upper end of this interval. The advance effectively doubles the capacity of tensor decomposition: for fourth-order tensors, twice as many rank-one components can be recovered as before.

πŸ“ Abstract
We consider a basic computational task of finding $s$ planted rank-1 $m \times n$ matrices in a linear subspace $\mathcal{U} \subseteq \mathbb{R}^{m \times n}$ where $\dim(\mathcal{U}) = R \ge s$. The work of Johnston-Lovitz-Vijayaraghavan (FOCS 2023) gave a polynomial-time algorithm for this task and proved that it succeeds when $R \le (1-o(1))mn/4$, under minimal genericity assumptions on the input. Aiming to precisely characterize the performance of this algorithm, we improve the bound to $R \le (1-o(1))mn/2$ and also prove that the algorithm fails when $R \ge (1+o(1))mn/\sqrt{2}$. Numerical experiments indicate that the true breaking point is $R = (1+o(1))mn/\sqrt{2}$. Our work implies new algorithmic results for tensor decomposition, for instance, decomposing order-4 tensors with twice as many components as before.
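To make the setup concrete, here is a minimal, hypothetical sketch (not the authors' code) of how such a planted instance can be generated: $s$ random rank-1 matrices plus $R - s$ generic matrices together spanning the subspace $\mathcal{U}$. The function name and the Gaussian choice of randomness are illustrative assumptions.

```python
import numpy as np

def planted_instance(m, n, s, R, seed=None):
    """Build a basis for an R-dimensional subspace of m x n matrices
    containing s planted rank-1 matrices; the rest of the basis is generic."""
    rng = np.random.default_rng(seed)
    # Planted components: outer products of random Gaussian vectors (rank 1).
    planted = [np.outer(rng.standard_normal(m), rng.standard_normal(n))
               for _ in range(s)]
    # Generic filler matrices (full rank with probability 1).
    generic = [rng.standard_normal((m, n)) for _ in range(R - s)]
    return planted, planted + generic

m, n, s = 10, 10, 3
R = m * n // 2                      # at the improved mn/2 threshold
planted, basis = planted_instance(m, n, s, R, seed=0)
print(len(basis))                            # 50 basis matrices
print(np.linalg.matrix_rank(planted[0]))     # 1
```

The recovery task is then: given only (a generic basis of) the span of `basis`, find the `s` rank-1 members.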
Problem

Research questions and friction points this paper is trying to address.

Finding s planted rank-1 matrices in a linear subspace
Improving the threshold for successful recovery
Precisely characterizing the performance of the polynomial-time algorithm
Innovation

Methods, ideas, or system contributions that make the work stand out.

Improved the success bound to R ≤ (1-o(1))mn/2
Proved the algorithm fails when R ≥ (1+o(1))mn/√2
Enabled order-4 tensor decomposition with twice as many components
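As a rough numerical illustration of the improvement (our numbers, ignoring the o(1) terms; not taken from the paper):

```python
import math

# Threshold values on the subspace dimension R for m = n = 20.
m, n = 20, 20
prior_success = m * n / 4          # JLV (FOCS 2023) success guarantee
improved_success = m * n / 2       # this paper's success guarantee
failure = m * n / math.sqrt(2)     # this paper's failure threshold
print(prior_success, improved_success, round(failure))  # 100.0 200.0 283
```

So for 20 x 20 matrices, the regime of guaranteed success doubles from dimension 100 to 200, and failure is proven above roughly 283.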
Jeshu Dastidar
Department of Mathematics, University of California, Davis
Tait Weicht
Department of Mathematics, University of California, Davis
Alexander S. Wein
UC Davis
theoretical computer science, statistics, probability, mathematics of data science