🤖 AI Summary
To address the challenge of robustly discovering low-dimensional manifolds embedded in high-dimensional data corrupted by noise and non-uniform sampling, this paper proposes Symmetric Sparse-Regularized Optimal Transport (SSROT), a method that constructs an adaptive sparse affinity matrix. Theoretically, SSROT’s continuous limit converges to the Laplace–Beltrami operator, and it exhibits intrinsic robustness to heteroscedastic noise. Methodologically, SSROT integrates symmetric optimal transport modeling, quadratic sparsity regularization, bistochastic kernel extension, and an efficient discrete optimal transport solver. Evaluated on multiple real-world benchmark datasets, SSROT consistently outperforms state-of-the-art manifold learning algorithms. It achieves a principled balance among theoretical consistency—guaranteed by spectral convergence—geometric fidelity—preserving intrinsic manifold structure—and computational efficiency—enabled by scalable sparse OT optimization. This work establishes a novel paradigm for manifold structure recovery under realistic, noisy, and irregularly sampled conditions.
📝 Abstract
Manifold learning is a central task in modern statistics and data science. Many datasets (cells, documents, images, molecules) can be represented as point clouds embedded in a high dimensional ambient space, however the degrees of freedom intrinsic to the data are usually far fewer than the number of ambient dimensions. The task of detecting a latent manifold along which the data are embedded is a prerequisite for a wide family of downstream analyses. Real-world datasets are subject to noisy observations and sampling, so that distilling information about the underlying manifold is a major challenge. We propose a method for manifold learning that utilises a symmetric version of optimal transport with a quadratic regularisation that constructs a sparse and adaptive affinity matrix, that can be interpreted as a generalisation of the bistochastic kernel normalisation. We prove that the resulting kernel is consistent with a Laplace-type operator in the continuous limit, establish robustness to heteroskedastic noise and exhibit these results in simulations. We identify a highly efficient computational scheme for computing this optimal transport for discrete data and demonstrate that it outperforms competing methods in a set of examples.