Euclidean Distance Matrix Completion via Asymmetric Projected Gradient Descent

📅 2025-04-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the sparse Euclidean Distance Matrix Completion (EDMC) problem: reconstructing the geometric configuration of a point set from partial pairwise distance observations. We propose an Asymmetric Projected Gradient Descent (APGD) algorithm based on the Burer–Monteiro factorization, which exploits the inherent low-rank structure of Euclidean distance matrices. Theoretically, we establish the first global convergence and exact recovery guarantees for EDMC without sample splitting, circumventing conventional assumptions such as the tangent-space Restricted Isometry Property (RIP) or curvature conditions; instead, we introduce novel upper bounds that replace the random graph lemma. Exact reconstruction is achieved from $\mathcal{O}(\mu^2 r^3 \kappa^2 n \log n)$ Bernoulli random observations. Empirically, APGD exhibits linear convergence in sample-rich regimes but degrades faster than s-stress optimization when samples are limited, trading some practical efficacy for theoretical rigor.

📝 Abstract
This paper proposes and analyzes a gradient-type algorithm based on the Burer–Monteiro factorization, called Asymmetric Projected Gradient Descent (APGD), for reconstructing a point set configuration from partial Euclidean distance measurements, known as the Euclidean Distance Matrix Completion (EDMC) problem. By paralleling the incoherent matrix completion framework, we show for the first time that a global convergence guarantee with exact recovery can be established for this routine given $\mathcal{O}(\mu^2 r^3 \kappa^2 n \log n)$ Bernoulli random observations, without any sample splitting. Unlike some very recent works that leverage the tangent-space Restricted Isometry Property (RIP) and the local curvature of the low-rank embedding manifold, our proof provides new upper bounds to replace the random graph lemma in the EDMC setting. APGD works surprisingly well: numerical experiments demonstrate exact linear convergence in sample-rich regimes, yet its performance deteriorates quickly compared with optimizing the s-stress function, i.e., the standard but unexplained non-convex approach for EDMC, when the sample size is limited. While virtually matching our theoretical prediction, this unusual phenomenon might indicate that: (i) the power of implicit regularization is weakened in the APGD case; (ii) stabilizing this new gradient direction requires substantially more samples than the information-theoretic limit would suggest.
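The factored-gradient idea the abstract describes can be illustrated with a minimal numpy sketch. This is not the paper's APGD (which uses an asymmetric Burer–Monteiro factorization and tailored projections); it is plain gradient descent on an s-stress-type loss over the point configuration with a centering projection, and all parameter choices here (n = 40 points, r = 2 dimensions, Bernoulli rate p = 0.6, step size) are illustrative assumptions.

```python
import numpy as np

# Toy sketch only: gradient descent on a factorized EDMC objective
# (an s-stress-type loss), not the paper's exact APGD algorithm.
rng = np.random.default_rng(0)

n, r = 40, 2                              # number of points, embedding dimension
X_true = rng.standard_normal((n, r))      # ground-truth configuration

def edm(X):
    """Squared Euclidean distance matrix of the rows of X."""
    g = np.sum(X * X, axis=1)
    return g[:, None] + g[None, :] - 2.0 * X @ X.T

D = edm(X_true)

# Symmetric Bernoulli sampling mask with zero diagonal.
p = 0.6
upper = np.triu(rng.random((n, n)) < p, 1)
mask = upper | upper.T
m = mask.sum()

def loss_and_grad(X):
    """Normalized squared residual on observed entries, and its gradient."""
    R = mask * (edm(X) - D)               # symmetric residual matrix
    loss = 0.5 * np.sum(R ** 2) / m
    # Gradient of the loss w.r.t. X: (4/m) * (diag(R @ 1) - R) @ X
    grad = (4.0 / m) * (np.diag(R.sum(axis=1)) - R) @ X
    return loss, grad

X = 0.5 * rng.standard_normal((n, r))     # random initialization
loss0, _ = loss_and_grad(X)
step = 0.2
for _ in range(3000):
    loss, grad = loss_and_grad(X)
    X = X - step * grad
    X -= X.mean(axis=0)                   # projection: keep the configuration centered

rel_err = np.linalg.norm(mask * (edm(X) - D)) / np.linalg.norm(mask * D)
```

In sample-rich regimes such as this one, the observed residual is typically driven close to zero, loosely mirroring the linear-convergence behavior the abstract reports; the recovered configuration is determined only up to a rigid motion, which is why the error is measured on distances rather than coordinates.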
Problem

Research questions and friction points this paper is trying to address.

Reconstruct point set from partial Euclidean distances
Prove global convergence with exact recovery guarantee
Analyze performance vs sample size in gradient descent
Innovation

Methods, ideas, or system contributions that make the work stand out.

Asymmetric Projected Gradient Descent for EDMC
Global convergence with exact recovery guarantee
New upper bounds replacing random graph lemma