🤖 AI Summary
This work addresses robust low-rank recovery of asymmetric matrices corrupted by gross outliers, where no prior information about the true rank is available. We propose the Over-Parameterized Preconditioned Subgradient Algorithm (OPSA), the first theoretically grounded preconditioned optimization method tailored to unknown-rank asymmetric matrices. OPSA closes a gap in convergence analysis under the joint challenges of matrix asymmetry, absence of rank priors, and outlier corruption. Leveraging over-parameterized matrix factorization, a robust matrix sensing formulation, and a mixed-norm Restricted Isometry Property (RIP) analysis, OPSA achieves linear convergence without knowledge of the true rank, at a rate that is independent of that rank. Extensive experiments demonstrate that OPSA remains robust and efficient across varying degrees of over-parameterization and outlier contamination.
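The summary does not include pseudocode, so as a rough illustration of the kind of iteration it describes, here is a minimal sketch of an over-parameterized preconditioned subgradient loop for robust matrix sensing. All problem sizes, the small random initialization, the step schedule `eta0`/`decay`, and the damping `lam` are our own illustrative assumptions, not OPSA's actual specification; the preconditioning (right-multiplying each factor's subgradient by a damped inverse Gram matrix) follows the general spirit of scaled/preconditioned gradient methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy robust matrix sensing instance (all sizes are illustrative) ---
n1, n2, r, k = 30, 25, 2, 5          # true rank r; over-parameterized rank k > r
m = 6 * r * (n1 + n2)                # number of linear measurements
X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))

# Gaussian measurement operator A(X)_i = <A_i, X>, with a fraction of
# measurements replaced by gross outliers.
A_ops = rng.standard_normal((m, n1, n2))
y = np.einsum('mij,ij->m', A_ops, X_true)
mask = rng.random(m) < 0.1           # ~10% of measurements are corrupted
y[mask] = 10.0 * y.std() * rng.standard_normal(mask.sum())

# --- Over-parameterized factors, small random initialization (assumed) ---
L = 0.1 * rng.standard_normal((n1, k))
R = 0.1 * rng.standard_normal((n2, k))

eta0, decay, lam = 0.3, 0.98, 1e-2   # hypothetical step schedule and damping
for t in range(500):
    residual = np.einsum('mij,ij->m', A_ops, L @ R.T) - y
    # Subgradient of the robust l1 loss, pulled back through the operator.
    G = np.einsum('m,mij->ij', np.sign(residual), A_ops) / m
    # Preconditioners: damped inverses of the factors' Gram matrices.
    P_R = np.linalg.inv(R.T @ R + lam * np.eye(k))
    P_L = np.linalg.inv(L.T @ L + lam * np.eye(k))
    eta = eta0 * decay ** t          # geometrically decaying subgradient step
    L, R = L - eta * (G @ R) @ P_R, R - eta * (G.T @ L) @ P_L

rel_err = np.linalg.norm(L @ R.T - X_true) / np.linalg.norm(X_true)
print(f"relative recovery error: {rel_err:.3f}")
```

Note that the l1 (rather than l2) loss is what confers robustness to the gross outliers, and the geometrically decaying step size is the standard device for obtaining linear convergence with subgradient methods on sharp objectives.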
📝 Abstract
In this paper, we focus on a matrix factorization-based approach to recover low-rank *asymmetric* matrices from corrupted measurements. We propose an *Overparameterized Preconditioned Subgradient Algorithm (OPSA)* and provide, for the first time in the literature, linear convergence rates independent of the rank of the sought asymmetric matrix in the presence of gross corruptions. Our work goes beyond existing results on preconditioned approaches by addressing their current limitation, namely the lack of convergence guarantees for *asymmetric matrices of unknown rank*. By applying our approach to (robust) matrix sensing, we highlight its merits when the measurement operator satisfies a mixed-norm restricted isometry property. Lastly, we present extensive numerical experiments that validate our theoretical results and demonstrate the effectiveness of our approach for different levels of overparameterization and outlier corruption.
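The abstract does not state the mixed-norm restricted isometry property explicitly. A common form in the robust matrix sensing literature is the l1/l2-RIP below; we note this as context only, and the paper's exact condition and constants may differ:

```latex
% l1/l2 mixed-norm RIP: for all matrices X with rank(X) <= r,
% the measurement operator A : R^{n_1 x n_2} -> R^m satisfies
c \,\| X \|_F \;\le\; \frac{1}{m}\,\big\| \mathcal{A}(X) \big\|_1 \;\le\; C \,\| X \|_F,
\qquad 0 < c \le C.
```

Intuitively, the l1 norm on the measurement side is what makes the property compatible with an l1 (outlier-robust) recovery loss, whereas the classical RIP bounds the l2 measurement norm.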