🤖 AI Summary
This work addresses the challenge of recovering Euclidean distance matrices (EDMs) from sparse and noisy observations, a critical problem in applications such as sensor network localization and acoustic room reconstruction. The authors propose a hierarchical Bayesian framework that embeds geometric constraints directly into the model by placing structured priors on the latent point configurations that generate the EDM. This approach enables automatic regularization and robust noise suppression, avoiding the performance degradation that conventional methods commonly exhibit under high sparsity or severe noise. Posterior inference is carried out via a Metropolis–Hastings within Gibbs sampling algorithm. Experiments on synthetic data show that the proposed method achieves markedly higher reconstruction accuracy than existing deterministic baselines, particularly in highly sparse observation regimes.
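The sampler described above can be sketched in a minimal form. This is an illustrative reconstruction, not the authors' implementation: the isotropic Gaussian prior on the points, the Gaussian noise likelihood on observed squared distances, the random-walk proposal, and all hyperparameter values (`sigma2`, `tau2`, `step`) are assumptions made for the sketch. Each Gibbs sweep updates one latent point at a time with a Metropolis–Hastings accept/reject step, and the completed EDM is read off the sampled configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def edm(X):
    """Squared Euclidean distance matrix from an (n, d) point set."""
    G = X @ X.T
    g = np.diag(G)
    return g[:, None] + g[None, :] - 2.0 * G

def log_post(X, D_obs, mask, sigma2, tau2):
    """Unnormalized log-posterior: Gaussian likelihood on the observed
    entries plus an isotropic Gaussian prior on the latent points
    (both are modeling assumptions for this sketch)."""
    resid = (edm(X) - D_obs)[mask]
    return -0.5 * np.sum(resid**2) / sigma2 - 0.5 * np.sum(X**2) / tau2

def mh_within_gibbs(D_obs, mask, d=2, n_iter=2000, step=0.05,
                    sigma2=0.01, tau2=1.0):
    """Metropolis-Hastings within Gibbs: sweep over points, propose a
    random-walk move for one point, accept with the MH ratio."""
    n = D_obs.shape[0]
    X = rng.normal(scale=0.5, size=(n, d))
    lp = log_post(X, D_obs, mask, sigma2, tau2)
    for _ in range(n_iter):
        for i in range(n):
            X_prop = X.copy()
            X_prop[i] += step * rng.normal(size=d)
            lp_prop = log_post(X_prop, D_obs, mask, sigma2, tau2)
            # Symmetric proposal, so the MH ratio is the posterior ratio.
            if np.log(rng.uniform()) < lp_prop - lp:
                X, lp = X_prop, lp_prop
    # Completed EDM from the final sample (a full treatment would
    # average over posterior samples instead).
    return edm(X)
```

Because the prior and proposal act on the point set rather than on matrix entries, every sampled matrix is a valid EDM by construction, which is the geometric-constraint embedding the summary refers to.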
📝 Abstract
The completion of a Euclidean distance matrix (EDM) from sparse and noisy observations is a fundamental challenge in signal processing, with applications in sensor network localization, acoustic room reconstruction, molecular conformation, and manifold learning. Traditional approaches, such as rank-constrained optimization and semidefinite programming, enforce geometric constraints but often struggle under sparse or noisy conditions. This paper introduces a hierarchical Bayesian framework that places structured priors directly on the latent point set generating the EDM, naturally embedding geometric constraints. The hierarchical prior on the latent point set enables automatic regularization and robust noise handling. Posterior inference is performed using a Metropolis–Hastings within Gibbs sampler to handle the coupled posterior over the latent points. Experiments on synthetic data demonstrate improved reconstruction accuracy compared to deterministic baselines in sparse regimes.