GeoDM: Geometry-aware Distribution Matching for Dataset Distillation

📅 2025-12-09
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing dataset distillation methods operate solely in Euclidean space, failing to capture the intrinsic nonlinear manifold structure, such as curvature, of real-world data. To address this, we propose GeoDM, the first geometry-aware dataset distillation framework operating on a product manifold that combines Euclidean, hyperbolic, and spherical geometries. GeoDM jointly learns curvature parameters and geometry weights to unify the modeling of flat, hierarchical, and periodic data structures. It performs distribution matching on the product manifold via optimal transport and differentiable manifold optimization, and a theoretical analysis establishes a tighter generalization error bound than prior Euclidean approaches. Extensive experiments demonstrate that GeoDM consistently outperforms state-of-the-art methods across multiple benchmarks, with superior robustness, generalization, and seamless compatibility with single-geometry baselines.

📝 Abstract
Dataset distillation aims to synthesize a compact subset of the original data, enabling models trained on it to achieve performance comparable to those trained on the original large dataset. Existing distribution-matching methods are confined to Euclidean spaces, so they capture only linear structures and overlook the intrinsic geometry of real data, e.g., curvature. However, high-dimensional data often lie on low-dimensional manifolds, suggesting that dataset distillation should align the distilled data manifold with the original data manifold. In this work, we propose a geometry-aware distribution-matching framework, called GeoDM, which operates in the Cartesian product of Euclidean, hyperbolic, and spherical manifolds, so that flat, hierarchical, and cyclical structures are all captured by a unified representation. To adapt to the underlying data geometry, we introduce learnable curvature and weight parameters for the three kinds of geometries. At the same time, we design an optimal transport loss to enhance distribution fidelity. Our theoretical analysis shows that geometry-aware distribution matching in a product space yields a smaller generalization error bound than its Euclidean counterparts. Extensive experiments on standard benchmarks demonstrate that our algorithm outperforms state-of-the-art dataset distillation methods and remains effective across various distribution-matching strategies for single geometries.
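The abstract's core construction, a product of Euclidean, hyperbolic, and spherical factors with learnable curvatures and weights, can be sketched as a weighted product-manifold distance. This is a minimal illustration, not the paper's implementation: the function names, the Poincaré-ball model for the hyperbolic factor, and the fixed weights are all assumptions for exposition.

```python
import numpy as np

def euclidean_dist(x, y):
    """Flat-factor distance (ordinary L2 norm)."""
    return np.linalg.norm(x - y)

def hyperbolic_dist(x, y, c=1.0):
    """Geodesic distance in the Poincare ball of curvature -c.
    Assumes c * ||x||^2 < 1 and c * ||y||^2 < 1 (points inside the ball)."""
    sq = c * np.sum((x - y) ** 2)
    denom = (1 - c * np.sum(x ** 2)) * (1 - c * np.sum(y ** 2))
    return (1 / np.sqrt(c)) * np.arccosh(1 + 2 * sq / denom)

def spherical_dist(x, y, c=1.0):
    """Great-circle distance on a sphere of curvature +c (radius 1/sqrt(c)),
    applied to the directions of x and y."""
    cos_angle = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return (1 / np.sqrt(c)) * np.arccos(np.clip(cos_angle, -1.0, 1.0))

def product_dist(x, y, weights=(1 / 3, 1 / 3, 1 / 3), c_hyp=1.0, c_sph=1.0):
    """Distance on the product manifold: weighted L2 combination of the
    per-factor distances. In the paper the curvatures and weights are
    learnable; here they are fixed scalars for illustration."""
    d = np.array([
        euclidean_dist(x, y),
        hyperbolic_dist(x, y, c_hyp),
        spherical_dist(x, y, c_sph),
    ])
    return np.sqrt(np.sum(np.array(weights) * d ** 2))
```

In a training loop, `c_hyp`, `c_sph`, and `weights` would be parameters updated by gradient descent along with the distilled samples, which is what makes the representation adapt to flat, hierarchical, or cyclical structure.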
Problem

Research questions and friction points this paper is trying to address.

Capturing the intrinsic geometry of real data
Aligning the distilled data manifold with the original
Enhancing distribution fidelity across geometries
Innovation

Methods, ideas, or system contributions that make the work stand out.

Geometry-aware distribution matching across multiple manifolds
Learnable curvature and weight parameters for data geometry adaptation
Optimal transport loss to enhance distribution fidelity
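The optimal-transport loss listed above can be sketched with entropy-regularized (Sinkhorn) OT between a distilled batch and a real batch. This is a generic stand-in under stated assumptions, not the paper's exact objective: the Euclidean cost matrix, uniform marginals, and the `sinkhorn_loss` name are illustrative, and a product-manifold metric could be substituted for the cost.

```python
import numpy as np

def sinkhorn_loss(X, Y, eps=0.1, n_iters=100):
    """Entropy-regularized OT cost between two point clouds with uniform
    weights. X: (n, d) distilled batch, Y: (m, d) real batch."""
    n, m = len(X), len(Y)
    # Pairwise squared Euclidean cost (a product-manifold distance could
    # replace this to make the loss geometry-aware).
    C = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / eps)            # Gibbs kernel
    a, b = np.ones(n) / n, np.ones(m) / m
    u, v = np.ones(n) / n, np.ones(m) / m
    # Sinkhorn iterations: alternately rescale to match both marginals.
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]  # transport plan with marginals a, b
    return np.sum(P * C)             # transport cost under the plan
```

Minimizing this quantity with respect to the distilled points `X` pulls their distribution toward that of the real data, which is the distribution-fidelity role the loss plays in the framework.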
Xuhui Li
Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence, Abu Dhabi, UAE
Zhengquan Luo
Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence, Abu Dhabi, UAE
Zihui Cui
Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence, Abu Dhabi, UAE
Zhiqiang Xu
Professor, Academy of Mathematics and Systems Science, Chinese Academy of Sciences
approximation theory, compressed sensing, splines, frame theory, quantization