🤖 AI Summary
This paper addresses the robust recovery of Hankel matrices corrupted by sparse outliers and a large fraction of missing entries. To tackle this ill-posed structured low-rank recovery problem, we propose the Hankel-Structured Newton-Like Descent (HSNLD) algorithm: it models a robust loss directly on the low-rank Hankel manifold and employs a structured Newton approximation for efficient optimization. Theoretically, HSNLD achieves linear convergence at a rate independent of the Hankel matrix's condition number, overcoming a key limitation of conventional methods that degrade under ill-conditioning. Recovery guarantees are established under mild nonconvex-optimization conditions. Experiments on both synthetic and real-world data demonstrate that HSNLD significantly outperforms state-of-the-art approaches, delivering both faster convergence and higher recovery accuracy.
📝 Abstract
This paper studies the robust Hankel recovery problem, which simultaneously removes sparse outliers and fills in missing entries from partial observations. We propose a novel non-convex algorithm, coined Hankel Structured Newton-Like Descent (HSNLD), to tackle the robust Hankel recovery problem. HSNLD is highly efficient with linear convergence, and its convergence rate is independent of the condition number of the underlying Hankel matrix. A recovery guarantee is established under mild conditions. Numerical experiments on both synthetic and real datasets show the superior performance of HSNLD against state-of-the-art algorithms.
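To make the problem setup concrete, here is a minimal sketch (not the authors' method) of the observation model the abstract describes: a signal whose Hankel lift is low-rank, observed through sparse outliers and missing entries. The window size `p`, the choice of two cosines, and the corruption rates are illustrative assumptions, not values from the paper.

```python
import numpy as np

def hankel_matrix(x, p):
    """Lift a length-n signal x into a p x (n - p + 1) Hankel matrix
    with H[i, j] = x[i + j], so every anti-diagonal is constant."""
    n = len(x)
    return np.array([[x[i + j] for j in range(n - p + 1)] for i in range(p)])

rng = np.random.default_rng(0)
n = 64
t = np.arange(n)

# A sum of two real sinusoids is four complex exponentials, so its
# Hankel lift has rank at most 4 (this is the low-rank structure
# that robust Hankel recovery exploits).
x = np.cos(0.3 * t) + 0.5 * np.cos(1.1 * t + 0.4)

H = hankel_matrix(x, p=16)          # 16 x 49 Hankel matrix
rank = np.linalg.matrix_rank(H, tol=1e-8)

# Corruption model from the abstract: sparse outliers + missing entries.
y = x.copy()
outlier_idx = rng.choice(n, size=5, replace=False)
y[outlier_idx] += 10 * rng.standard_normal(5)   # sparse, large outliers
mask = rng.random(n) > 0.4                      # ~60% of entries observed
y_obs = np.where(mask, y, np.nan)               # missing entries as NaN
```

The recovery task is then to infer the clean low-rank Hankel structure (and hence `x`) from `y_obs` alone, which HSNLD attacks with a robust loss on the low-rank Hankel manifold.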