🤖 AI Summary
This work addresses low-rank optimization problems, including orthogonally constrained quadratic optimization and matrix completion, through a novel relax-then-sample paradigm. Methodologically, it develops a semidefinite relaxation framework inspired by the Goemans–Williamson approach, yielding relaxations that are both tighter and more broadly applicable than prior formulations; by exploiting block-wise sparsity in the semidefinite matrices, it substantially reduces the number of decision variables, and it recovers feasible solutions efficiently via randomized rounding. Theoretically, the method comes with provable near-optimality guarantees. Empirically, it achieves strong accuracy and scalability, enlarging the solvable problem sizes by several orders of magnitude and thereby broadening the practical reach of low-rank modeling for large-scale orthogonal optimization and matrix completion tasks.
📝 Abstract
Inspired by the impact of the Goemans-Williamson algorithm on combinatorial optimization, we construct an analogous relax-then-sample strategy for low-rank optimization problems. First, for orthogonally constrained quadratic optimization problems, we derive a semidefinite relaxation and a randomized rounding scheme that obtain provably near-optimal solutions, mimicking the blueprint of Goemans and Williamson for the Max-Cut problem. We then extend our approach to generic low-rank optimization problems by developing new semidefinite relaxations that are both tighter and more broadly applicable than those in prior work. Although our original proposal introduces large semidefinite matrices as decision variables, we show that most of the blocks in these matrices can be safely omitted without altering the optimal value, thereby improving the scalability of our approach. Using several examples (including matrix completion, basis pursuit, and reduced-rank regression), we show how to reduce the size of our relaxation even further. Finally, we numerically illustrate the effectiveness and scalability of our relaxation and sampling scheme on orthogonally constrained quadratic optimization and matrix completion problems.
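To make the relax-then-sample blueprint concrete, here is a minimal, self-contained sketch of the classical Goemans-Williamson hyperplane rounding that the abstract takes as its inspiration (this is the textbook Max-Cut procedure, not the paper's low-rank method). To avoid calling an SDP solver, we use the triangle graph K3, whose optimal Max-Cut SDP solution is known in closed form: three unit vectors spaced 120 degrees apart in the plane.

```python
import math
import random

# Toy instance: the triangle graph K3 with unit edge weights.
# The optimal Max-Cut SDP places the three unit vectors 120 degrees
# apart, so we hardcode that solution instead of solving the SDP.
vectors = [(math.cos(2 * math.pi * i / 3), math.sin(2 * math.pi * i / 3))
           for i in range(3)]
edges = [(0, 1), (1, 2), (0, 2)]

def gw_round(vectors, edges, rng):
    """One Goemans-Williamson rounding step: draw a random hyperplane
    through the origin and assign each vertex the sign of its vector's
    inner product with the hyperplane's normal; return the cut value."""
    theta = rng.uniform(0.0, 2.0 * math.pi)
    g = (math.cos(theta), math.sin(theta))          # random unit normal
    side = [1 if v[0] * g[0] + v[1] * g[1] >= 0 else -1 for v in vectors]
    return sum(1 for i, j in edges if side[i] != side[j])

rng = random.Random(0)
best = max(gw_round(vectors, edges, rng) for _ in range(20))
# For K3 the maximum cut is 2, and hyperplane rounding of the
# 120-degree configuration attains it (almost surely).
```

The paper's orthogonally constrained setting replaces the rank-one sign vectors of Max-Cut with matrices having orthonormal columns, but the two-phase structure (solve a semidefinite relaxation, then sample and round) is the same.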