A Smoothing Newton Method for Rank-one Matrix Recovery

📅 2025-07-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses phase retrieval of rank-one positive semidefinite matrices from rank-one measurements. To overcome key limitations of the Bures–Wasserstein gradient descent method (nonsmoothness, numerical instability, and a gap between theoretical convergence rates and empirical performance), we propose a novel smoothed Newton framework. Specifically, we construct a differentiable approximation of the objective function via matrix smoothing and design a Newton-type iteration tailored to the Bures–Wasserstein geometry. We establish, for the first time, rigorous superlinear convergence guarantees for this class of algorithms. Extensive experiments on synthetic data demonstrate that the proposed method achieves stable, rapid, and robust matrix recovery, significantly outperforming existing nonsmooth optimization approaches in both accuracy and efficiency.

📝 Abstract
We consider the phase retrieval problem, which involves recovering a rank-one positive semidefinite matrix from rank-one measurements. A recently proposed algorithm based on Bures-Wasserstein gradient descent (BWGD) exhibits superlinear convergence, but it is unstable, and existing theory can only prove local linear convergence for higher rank matrix recovery. We resolve this gap by revealing that BWGD implements Newton's method with a nonsmooth and nonconvex objective. We develop a smoothing framework that regularizes the objective, enabling a stable method with rigorous superlinear convergence guarantees. Experiments on synthetic data demonstrate the method's superior stability while maintaining fast convergence.
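The abstract's idea of smoothing a nonsmooth phase-retrieval objective and then applying a Newton-type iteration can be sketched in a toy setting. This is not the paper's algorithm (which works in Bures-Wasserstein geometry on the matrix factor); it is a minimal illustration, assuming measurements $y_i = (a_i^\top x^\ast)^2$, the smoothing $\phi_\mu(t) = \sqrt{t^2 + \mu^2}$ of $|t|$, a continuation schedule that shrinks $\mu$, damped Newton steps in the vector factor $x$, and an initialization near the truth (a local method). All parameter choices are illustrative assumptions.

```python
import numpy as np

# Toy sketch of a smoothing Newton scheme for phase retrieval (assumptions:
# loss (1/m) sum_i phi_mu((a_i^T x)^2 - y_i) with phi_mu(t) = sqrt(t^2 + mu^2),
# continuation on mu, near-truth init; not the paper's Bures-Wasserstein method).
rng = np.random.default_rng(0)
n, m = 5, 60
x_star = rng.standard_normal(n)
A = rng.standard_normal((m, n))      # measurement vectors a_i as rows
y = (A @ x_star) ** 2                # rank-one measurements y_i = (a_i^T x*)^2

def loss_grad_hess(x, mu):
    s = A @ x                        # s_i = a_i^T x
    r = s**2 - y                     # residuals
    phi = np.sqrt(r**2 + mu**2)      # smoothed |r|
    w1 = r / phi                     # phi'(r)
    w2 = mu**2 / phi**3              # phi''(r)
    f = phi.mean()
    g = A.T @ (2 * w1 * s) / m
    H = (A.T * (4 * w2 * s**2 + 2 * w1)) @ A / m  # sum of weighted a_i a_i^T
    return f, g, H

x = x_star + 0.1 * rng.standard_normal(n)  # local method: start near the truth
mu = 1.0
for _ in range(40):
    f, g, H = loss_grad_hess(x, mu)
    d = np.linalg.solve(H + 1e-8 * np.eye(n), -g)      # damped Newton step
    t = 1.0
    while loss_grad_hess(x + t * d, mu)[0] > f and t > 1e-8:
        t *= 0.5                                       # simple backtracking
    x = x + t * d
    mu = max(0.5 * mu, 1e-9)                           # continuation: shrink mu

err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))  # sign ambiguity
print(f"recovery error: {err:.2e}")
```

Note that $x^\ast$ stays a stationary point of the smoothed loss for every $\mu > 0$ (since $\phi_\mu'(0) = 0$), so the continuation does not bias the recovered solution; shrinking $\mu$ only sharpens the approximation of the original nonsmooth objective.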
Problem

Research questions and friction points this paper is trying to address.

Recover rank-one matrix from rank-one measurements
Stabilize Bures-Wasserstein gradient descent algorithm
Ensure superlinear convergence with smoothing framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Smoothing Newton Method for stability
Regularizes nonsmooth nonconvex objective
Superlinear convergence with rigorous guarantees
Tyler Maunu
Brandeis University
Gabriel Abreu
Department of Mathematics, Brandeis University