Structure-Preserving Multi-View Embedding Using Gromov-Wasserstein Optimal Transport

📅 2026-04-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of effectively fusing multi-view data under heterogeneous geometries or nonlinear distortions, where traditional methods often fail to recover a consistent low-dimensional structure. The study introduces Gromov-Wasserstein (GW) optimal transport into multi-view embedding for the first time, proposing two geometry-aware strategies. Mean-GWMDS aligns distance matrices from individual views via GW coupling, averages them, and applies multidimensional scaling to obtain a unified embedding. Multi-GWMDS, in contrast, generates multiple geometrically consistent candidate embeddings and selects the optimal one. Notably, both approaches operate without explicit feature alignment or concatenation, enabling robust handling of nonlinearities and heterogeneous geometries. Experiments on synthetic manifolds and real-world datasets demonstrate superior cross-view structural preservation compared to existing methods.
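The Mean-GWMDS pipeline described above (per-view distance matrices, averaging, then multidimensional scaling) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes the samples already correspond across views, so the GW-coupling alignment step becomes trivial and is omitted; in the general case a GW solver (e.g. from the POT library) would align the distance matrices first. The helper names `pairwise_dists`, `classical_mds`, and `mean_gwmds` are made up for this sketch.

```python
import numpy as np

def pairwise_dists(X):
    # Euclidean distance matrix for one view (n samples x d features)
    sq = np.sum(X**2, axis=1, keepdims=True)
    D2 = sq + sq.T - 2.0 * X @ X.T
    return np.sqrt(np.maximum(D2, 0.0))

def classical_mds(D, k=2):
    # Classical MDS: double-center the squared distances, then take
    # the top-k eigenvectors scaled by the root of their eigenvalues.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D**2) @ J
    w, V = np.linalg.eigh(B)            # ascending eigenvalues
    idx = np.argsort(w)[::-1][:k]       # pick the k largest
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def mean_gwmds(views, k=2):
    # Sketch of the Mean-GWMDS idea: average the per-view distance
    # matrices (GW alignment is trivial here because samples already
    # correspond), then embed the mean matrix with MDS.
    D_mean = np.mean([pairwise_dists(X) for X in views], axis=0)
    return classical_mds(D_mean, k)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
views = [X, X @ rng.normal(size=(5, 5))]  # two linearly distorted views
Y = mean_gwmds(views, k=2)
print(Y.shape)  # (30, 2)
```

The averaging step is what makes the method concatenation-free: only relational (distance) information from each view enters the shared embedding, never the raw features.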
📝 Abstract
Multi-view data analysis seeks to integrate multiple representations of the same samples in order to recover a coherent low-dimensional structure. Classical approaches often rely on feature concatenation or explicit alignment assumptions, which become restrictive under heterogeneous geometries or nonlinear distortions. In this work, we propose two geometry-aware multi-view embedding strategies grounded in Gromov-Wasserstein (GW) optimal transport. The first, termed Mean-GWMDS, aggregates view-specific relational information by averaging distance matrices and applying GW-based multidimensional scaling to obtain a representative embedding. The second strategy, referred to as Multi-GWMDS, adopts a selection-based paradigm in which multiple geometry-consistent candidate embeddings are generated via GW-based alignment and a representative embedding is selected. Experiments on synthetic manifolds and real-world datasets show that the proposed methods effectively preserve intrinsic relational structure across views. These results highlight GW-based approaches as a flexible and principled framework for multi-view representation learning.
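The selection-based Multi-GWMDS strategy from the abstract (generate one candidate embedding per view, then pick a representative) can be sketched in the same spirit. This is a hypothetical illustration under simplifying assumptions: the candidate comparison uses a Frobenius-norm discrepancy between normalized distance matrices as a stand-in for the GW discrepancy the paper uses, and all function names are invented for this sketch.

```python
import numpy as np

def pairwise_dists(X):
    # Euclidean distance matrix for one view
    sq = np.sum(X**2, axis=1, keepdims=True)
    return np.sqrt(np.maximum(sq + sq.T - 2.0 * X @ X.T, 0.0))

def classical_mds(D, k=2):
    # Classical MDS via eigendecomposition of the double-centered Gram matrix
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    w, V = np.linalg.eigh(-0.5 * J @ (D**2) @ J)
    idx = np.argsort(w)[::-1][:k]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def multi_gwmds(views, k=2):
    # Sketch of the Multi-GWMDS idea: embed each view separately,
    # then select the candidate whose distance structure is, on
    # average, closest to all views. The Frobenius discrepancy on
    # scale-normalized distance matrices stands in for the GW cost.
    Ds = [pairwise_dists(X) for X in views]
    Ds = [D / D.max() for D in Ds]
    candidates = [classical_mds(D, k) for D in Ds]

    def avg_discrepancy(Y):
        DY = pairwise_dists(Y)
        DY = DY / DY.max()
        return np.mean([np.linalg.norm(DY - D) for D in Ds])

    return min(candidates, key=avg_discrepancy)

rng = np.random.default_rng(1)
X = rng.normal(size=(25, 4))
views = [X, X + 0.1 * rng.normal(size=X.shape)]  # two noisy views
Y = multi_gwmds(views, k=2)
print(Y.shape)  # (25, 2)
```

Selection (rather than averaging) keeps each candidate internally consistent with one view's geometry, which can help when the views disagree strongly and a mean distance matrix would blur both structures.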
Problem

Research questions and friction points this paper is trying to address.

multi-view embedding
heterogeneous geometries
nonlinear distortions
structure preservation
relational structure
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gromov-Wasserstein optimal transport
multi-view embedding
geometry-aware learning
relational structure preservation
multidimensional scaling