🤖 AI Summary
Weighted Low-Rank Approximation (WLRA) seeks a rank-$k$ matrix $XY^\top$ minimizing the weighted Frobenius norm $\|W \circ (M - XY^\top)\|_F$, given a matrix $M$ and a nonnegative weight matrix $W$. This problem is NP-hard and hard to approximate. This paper proposes the first alternating minimization framework for WLRA that simultaneously achieves strong theoretical guarantees and high efficiency. Our method integrates a high-accuracy multi-response regression solver into each alternating update step, enabling approximate yet controllable subproblem solving. Crucially, it preserves global convergence while reducing the per-iteration time complexity from $O(\|W\|_0 k^2)$ to $O(\|W\|_0 k)$, where $\|W\|_0$ denotes the number of nonzero entries in $W$, yielding substantial speedups for sparse weighting patterns. Experiments demonstrate state-of-the-art performance on matrix completion and noise-robust recovery tasks.
📄 Abstract
Weighted low rank approximation is a fundamental problem in numerical linear algebra, and it has many applications in machine learning. Given a matrix $M \in \mathbb{R}^{n \times n}$, a non-negative weight matrix $W \in \mathbb{R}_{\geq 0}^{n \times n}$, and a parameter $k$, the goal is to output two matrices $X, Y \in \mathbb{R}^{n \times k}$ such that $\| W \circ (M - X Y^\top) \|_F$ is minimized, where $\circ$ denotes the Hadamard product. It naturally generalizes the well-studied low rank matrix completion problem. This problem is known to be NP-hard and even hard to approximate assuming the Exponential Time Hypothesis [GG11, RSW16]. Meanwhile, alternating minimization is a good heuristic solution for weighted low rank approximation. In particular, [LLR16] shows that, under mild assumptions, alternating minimization does provide provable guarantees. In this work, we develop an efficient and robust framework for alternating minimization that allows the alternating updates to be computed approximately. For weighted low rank approximation, this improves the runtime of [LLR16] from $\|W\|_0 k^2$ to $\|W\|_0 k$, where $\|W\|_0$ denotes the number of nonzero entries of the weight matrix. At the heart of our framework is a high-accuracy multiple response regression solver together with a robust analysis of alternating minimization.
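The alternating updates described above can be sketched as follows. This is a minimal illustration, not the paper's method: the hypothetical `wlra_altmin` helper solves each per-row weighted least-squares subproblem exactly with `numpy.linalg.lstsq`, whereas the paper replaces these exact solves with a high-accuracy approximate multiple response regression solver to obtain the improved $\|W\|_0 k$ runtime.

```python
import numpy as np

def wlra_altmin(M, W, k, iters=20, seed=0):
    """Plain alternating minimization for min ||W o (M - X Y^T)||_F (sketch).

    With X fixed, the objective decouples over the rows of Y: row j solves a
    weighted least-squares problem in k variables using column j of M and W.
    The X update is symmetric. (Illustrative only; exact per-row solves.)
    """
    rng = np.random.default_rng(seed)
    n, m = M.shape
    X = rng.standard_normal((n, k))
    Y = rng.standard_normal((m, k))
    for _ in range(iters):
        # Fix X, update Y: for each column j of M, minimize
        # || diag(W[:, j]) (M[:, j] - X y_j) ||_2 over y_j.
        for j in range(m):
            w = W[:, j]
            Y[j] = np.linalg.lstsq(X * w[:, None], w * M[:, j], rcond=None)[0]
        # Fix Y, update X symmetrically over the rows of M.
        for i in range(n):
            w = W[i, :]
            X[i] = np.linalg.lstsq(Y * w[:, None], w * M[i, :], rcond=None)[0]
    return X, Y
```

With all-ones weights this reduces to unweighted alternating least squares, which recovers an exactly rank-$k$ matrix up to numerical precision; general sparse weights are where the per-iteration cost of the regression solves dominates.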