Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein Projection

📅 2024-02-03
🏛️ arXiv.org
📈 Citations: 5
Influential: 1
📄 PDF
🤖 AI Summary
Conventional dimensionality reduction and clustering methods for high-dimensional data are typically decoupled, which hinders effective modeling of multi-scale structural patterns. Method: This paper proposes a unified framework, called distributional reduction, that embeds both tasks within the Gromov–Wasserstein (GW) optimal transport geometry. By modeling the intrinsic metric structure of the data via GW projection, the framework jointly learns low-dimensional embeddings and multi-scale prototypes through a single optimization objective, recovering dimensionality reduction and clustering as special cases of the GW problem. Contribution/Results: Evaluated on multiple image and genomic datasets, the method simultaneously improves the interpretability of dimensionality reduction and the accuracy of clustering. It identifies semantically coherent low-dimensional prototypes across scales, demonstrating the effectiveness and generality of joint multi-scale structural modeling.

📝 Abstract
Unsupervised learning aims to capture the underlying structure of potentially large and high-dimensional datasets. Traditionally, this involves using dimensionality reduction (DR) methods to project data onto lower-dimensional spaces or organizing points into meaningful clusters (clustering). In this work, we revisit these approaches under the lens of optimal transport and exhibit relationships with the Gromov-Wasserstein problem. This unveils a new general framework, called distributional reduction, that recovers DR and clustering as special cases and allows addressing them jointly within a single optimization problem. We empirically demonstrate its relevance to the identification of low-dimensional prototypes representing data at different scales, across multiple image and genomic datasets.
Problem

Research questions and friction points this paper is trying to address.

How to unify dimensionality reduction and clustering, which are traditionally treated as decoupled tasks
How to formulate joint structure discovery as a single optimization problem
How to identify low-dimensional prototypes that represent data at different scales across diverse datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies dimensionality reduction and clustering
Uses Gromov-Wasserstein optimal transport framework
Joint optimization for multi-scale prototype identification
Hugues van Assel
ENS de Lyon, UMPA UMR 5669
Cédric Vincent-Cuaz
EPFL, Lausanne LTS4
N. Courty
Université Bretagne Sud, IRISA UMR 6074
Rémi Flamary
École polytechnique, IP Paris, CMAP UMR 7641
Pascal Frossard
École Polytechnique Fédérale de Lausanne (EPFL)
machine learning, image processing, computer vision, multimedia communications
Titouan Vayer
Inria
optimal transport, graphs, inverse problems