Riemann$^2$: Learning Riemannian Submanifolds from Riemannian Data

📅 2025-03-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing methods for modeling data with Riemannian geometric structure, such as unit-norm vectors or symmetric positive-definite (SPD) matrices, often neglect intrinsic manifold constraints, producing latent representations that lack geometric consistency and interpretable metrics. To address this, the paper estimates the pullback metric induced by a Wrapped Gaussian Process Latent Variable Model (Wrapped GPLVM), yielding geometry-aware distances and shortest paths in the latent space while guaranteeing that probability mass is assigned only to the data manifold. The result is geometry-aware nonlinear dimensionality reduction for constrained data. Evaluated on robot motion synthesis and brain connectome analysis, the approach improves modeling accuracy, generation consistency, and interpretability of inference, offering a theoretically grounded and practically effective framework for latent variable modeling of geometrically constrained data.

📝 Abstract
Latent variable models are powerful tools for learning low-dimensional manifolds from high-dimensional data. However, when dealing with constrained data such as unit-norm vectors or symmetric positive-definite matrices, existing approaches ignore the underlying geometric constraints or fail to provide meaningful metrics in the latent space. To address these limitations, we propose to learn Riemannian latent representations of such geometric data. To do so, we estimate the pullback metric induced by a Wrapped Gaussian Process Latent Variable Model, which explicitly accounts for the data geometry. This enables us to define geometry-aware notions of distance and shortest paths in the latent space, while ensuring that our model only assigns probability mass to the data manifold. This generalizes previous work and allows us to handle complex tasks in various domains, including robot motion synthesis and analysis of brain connectomes.
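To make the pullback-metric idea concrete, here is a minimal, hypothetical sketch: a smooth decoder map from a 2-D latent space onto the unit sphere (standing in for the Wrapped GPLVM's mean map; this is not the paper's implementation), whose Jacobian pulls the ambient Euclidean metric back into the latent space as M(z) = Jᵀ J.

```python
import numpy as np

def decoder(z):
    # Hypothetical smooth map from a 2-D latent space onto the unit
    # sphere S^2 (spherical coordinates); a stand-in for the learned
    # Wrapped GPLVM mean map, NOT the paper's actual model.
    phi, theta = z
    return np.array([np.cos(phi) * np.sin(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(theta)])

def pullback_metric(f, z, eps=1e-6):
    # Central-difference Jacobian J of f at z; the pullback of the
    # ambient Euclidean metric is M(z) = J^T J.
    z = np.asarray(z, dtype=float)
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f(z - dz)) / (2 * eps)
    return J.T @ J

M = pullback_metric(decoder, [0.3, 1.0])
# For this decoder, M is the familiar round-sphere metric in
# spherical coordinates: diag(sin^2(theta), 1).
```

Lengths and shortest paths in the latent space are then measured under M(z), which is what makes latent distances geometrically meaningful rather than arbitrary Euclidean ones.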
Problem

Research questions and friction points this paper is trying to address.

Learn Riemannian submanifolds from geometric data
Address limitations in existing latent variable models
Enable geometry-aware distance and path metrics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wrapped Gaussian Process for Riemannian data
Geometry-aware latent space distance metrics
Probability mass restricted to data manifold
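The restriction of probability mass to the data manifold follows the standard wrapped-Gaussian construction: sample in the tangent space at a base point, then push the samples through the manifold's exponential map. A minimal sketch on S², with hypothetical helper names (assumptions, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere_exp(mu, v):
    # Exponential map on the unit sphere: carries a tangent vector v
    # at mu to a point on the sphere along the geodesic through mu.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return mu
    return np.cos(norm) * mu + np.sin(norm) * (v / norm)

def sample_wrapped_gaussian(mu, cov, basis, n=100):
    # Draw Gaussian coefficients in the tangent space at mu (spanned
    # by the orthonormal rows of `basis`) and wrap them onto the sphere.
    coeffs = rng.multivariate_normal(np.zeros(cov.shape[0]), cov, size=n)
    return np.array([sphere_exp(mu, basis.T @ c) for c in coeffs])

mu = np.array([0.0, 0.0, 1.0])            # base point: north pole
basis = np.array([[1.0, 0.0, 0.0],         # orthonormal tangent basis at mu
                  [0.0, 1.0, 0.0]])
samples = sample_wrapped_gaussian(mu, 0.1 * np.eye(2), basis)
# Every sample has unit norm, i.e. lies exactly on S^2; no mass
# escapes the manifold, unlike an ambient Euclidean Gaussian.
```

By construction the wrapped density is supported entirely on the sphere, which is the property the Innovation bullet refers to.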