Persistence Spheres: A Bi-continuous Linear Representation of Measures for Partial Optimal Transport

📅 2026-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of representing persistence diagrams and related measures under partial optimal transport (POT) by introducing "Persistence Spheres": an explicit embedding that maps integrable measures on the upper half-plane into a space of continuous functions on the sphere, built from convex geometry and ReLU integrals. To the authors' knowledge, it is the first explicit representation in topological machine learning that is stable with respect to the POT₁ distance, linear, and endowed with an inverse that is continuous at every compactly supported target in the image. The construction inherently encodes a deletion mechanism, capturing optimal transport to the diagonal at persistence cost without requiring hyperparameter tuning. Empirical evaluations show that Persistence Spheres match or outperform established baselines, including persistence images, landscapes, splines, and sliced Wasserstein kernels, across clustering, regression, and classification tasks on functional data, time series, graphs, meshes, and point clouds.
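To make the map more concrete, the sketch below evaluates the lift-zonoid support function mentioned in the summary for a discrete diagram: each sphere direction $v$ receives the value $\sum_i \mathrm{ReLU}(\langle v, (b_i, d_i, 1)\rangle)$. This is an illustrative guess at the basic construction, not the paper's exact definition; the helper names (`fibonacci_sphere`, `sphere_summary`), the direction sampling, and any normalization are assumptions.

```python
# Minimal sketch of a lift-zonoid-style spherical summary of a persistence
# diagram. Illustrative only: the actual persistence spheres also include a
# signed diagonal augmentation and possibly different normalizations.
import numpy as np

def fibonacci_sphere(n):
    """Return n roughly uniform unit directions on the sphere S^2."""
    k = np.arange(n) + 0.5
    phi = np.arccos(1.0 - 2.0 * k / n)           # polar angles
    theta = np.pi * (3.0 - np.sqrt(5.0)) * k     # golden-angle azimuths
    return np.stack([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)], axis=1)

def sphere_summary(diagram, directions, weights=None):
    """Evaluate v -> sum_i w_i * ReLU(<v, (b_i, d_i, 1)>) at each direction v.

    diagram:    (m, 2) array of (birth, death) pairs, read as a counting measure.
    directions: (n, 3) array of unit vectors in S^2.
    """
    pts = np.asarray(diagram, dtype=float)
    lifted = np.hstack([pts, np.ones((len(pts), 1))])   # lift (b, d) -> (b, d, 1)
    if weights is None:
        weights = np.ones(len(pts))
    dots = directions @ lifted.T                        # (n, m) inner products
    return np.maximum(dots, 0.0) @ weights              # ReLU integral per direction

# Usage: a diagram becomes a fixed-length vector that any standard learner can consume.
dirs = fibonacci_sphere(256)
pd_points = np.array([[0.1, 0.9], [0.3, 0.4]])
vec = sphere_summary(pd_points, dirs)                   # shape (256,)
```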

📝 Abstract
We improve and extend persistence spheres, introduced in~\cite{pegoraro2025persistence}. Persistence spheres map an integrable measure $\mu$ on the upper half-plane, including persistence diagrams (PDs) as counting measures, to a function $S(\mu)\in C(\mathbb{S}^2)$, and the map is stable with respect to the 1-Wasserstein partial transport distance $\mathrm{POT}_1$. Moreover, to the best of our knowledge, persistence spheres are the first explicit representation used in topological machine learning for which continuity of the inverse on the image is established at every compactly supported target. Recent bounded-cardinality bi-Lipschitz embedding results in partial transport spaces, while powerful, are not given by the kind of explicit summary map considered here. Our construction is rooted in convex geometry: for positive measures, the defining ReLU integral is the support function of the lift zonoid. Building on~\cite{pegoraro2025persistence}, we refine the definition to better match the $\mathrm{POT}_1$ deletion mechanism, encoding partial transport via a signed diagonal augmentation. In particular, for integrable $\mu$, the uniform norm between $S(0)$ and $S(\mu)$ depends only on the persistence of $\mu$, without any need for ad hoc re-weightings, reflecting optimal transport to the diagonal at persistence cost. This yields a parameter-free representation at the level of measures (up to numerical discretization), while accommodating future extensions where $\mu$ is a smoothed measure derived from PDs (e.g., persistence intensity functions~\citep{wu2024estimation}). Across clustering, regression, and classification tasks involving functional data, time series, graphs, meshes, and point clouds, the updated persistence spheres are competitive and often improve upon persistence images, persistence landscapes, persistence splines, and sliced Wasserstein kernel baselines.
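The signed diagonal augmentation and the persistence-cost deletion property described in the abstract can be sketched by extending the snippet above: pair each point with its orthogonal projection onto the diagonal taken with a negative sign, so the empty measure maps to the zero function and the sup-norm distance to it is controlled by total persistence. The name `signed_sphere_summary` and the exact choice of signs and weights are assumptions for illustration, not the paper's precise refinement.

```python
def signed_sphere_summary(diagram, directions):
    """Sketch of a signed diagonal augmentation (assumed form, not the paper's exact one).

    Each point (b, d) contributes with weight +1 and its orthogonal projection
    onto the diagonal, ((b+d)/2, (b+d)/2), contributes with weight -1. The
    empty diagram then maps to the zero function, and the sup norm of the
    summary is bounded by a multiple of the total persistence sum_i (d_i - b_i),
    mimicking transport to the diagonal at persistence cost.
    """
    pts = np.asarray(diagram, dtype=float)
    mid = 0.5 * (pts[:, 0] + pts[:, 1])
    diag = np.stack([mid, mid], axis=1)        # diagonal projections of the points
    return sphere_summary(pts, directions) - sphere_summary(diag, directions)

# Deleting a low-persistence point barely moves the summary in sup norm,
# while high-persistence points dominate it.
print(np.abs(signed_sphere_summary(pd_points, dirs)).max())
```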
Problem

Research questions and friction points this paper is trying to address.

persistence spheres
partial optimal transport
topological machine learning
Wasserstein distance
measure representation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Persistence Spheres
Partial Optimal Transport
Topological Machine Learning
Support Function
Bi-continuous Representation