Bregman-Hausdorff divergence: strengthening the connections between computational geometry and machine learning

📅 2025-04-09
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
The classical Hausdorff distance cannot be applied in asymmetric distance spaces, in particular on statistical manifolds equipped with Bregman divergences, because it presupposes a symmetric metric. Method: We propose the *Bregman-Hausdorff divergence*, a principled generalization of the Hausdorff distance to asymmetric information-geometric settings. The approach combines Bregman geometry, probabilistic forecasting, and computational geometry to design scalable, efficient algorithms for measuring similarity between sets of probabilistic predictions, even in spaces with hundreds of dimensions. Contribution/Results: (1) a rigorously defined, asymmetric Hausdorff-type divergence grounded in Bregman divergences; (2) theoretical guarantees, including consistency and metric-like properties, together with practical, numerically stable algorithms; (3) an empirical validation on tasks comparing probabilistic predictions from multiple machine learning models, demonstrating that the framework captures geometric structure and supports cross-disciplinary modeling at the intersection of information geometry and machine learning.
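To illustrate the idea (this is a brute-force sketch for intuition, not the paper's efficient algorithms; the function names and the choice of argument order are assumptions of this sketch), a directed Hausdorff-style divergence under the Kullback-Leibler divergence can be written as:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def directed_bregman_hausdorff(A, B, div=kl):
    """Directed Hausdorff-style divergence: for each point of A, find its
    nearest point of B under `div`, then take the worst such case.
    Naive O(|A|*|B|) scan; the paper proposes far more efficient algorithms."""
    return max(min(div(a, b) for b in B) for a in A)

# Two small collections of probabilistic predictions (probability vectors).
A = [[0.2, 0.8], [0.5, 0.5]]
B = [[0.25, 0.75], [0.6, 0.4]]

d_AB = directed_bregman_hausdorff(A, B)
d_BA = directed_bregman_hausdorff(B, A)
# Because KL is asymmetric, the two directed values generally differ.
```

The asymmetry of the underlying divergence is exactly why the classical (symmetric) Hausdorff definition must be generalized rather than applied directly.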

📝 Abstract
The purpose of this paper is twofold. On the technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on the family of Bregman divergences, which includes the popular Kullback--Leibler divergence (also known as relative entropy). As a proof of concept, we use the resulting Bregman--Hausdorff divergence to compare two collections of probabilistic predictions produced by different machine learning models trained using the relative entropy loss. The algorithms we propose are surprisingly efficient even for large inputs with hundreds of dimensions. In addition to introducing this technical concept, we provide a survey. It outlines the basics of Bregman geometry, as well as computational geometry algorithms. We focus on algorithms that are compatible with this geometry and are relevant for machine learning.
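To make the Bregman family mentioned above concrete: a Bregman divergence is generated by a strictly convex function F via D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>. The following minimal sketch (function names are illustrative, not from the paper) checks that choosing F as the negative Shannon entropy recovers the Kullback-Leibler divergence on probability vectors:

```python
import math

def bregman_divergence(F, grad_F, p, q):
    """Generic Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    inner = sum(g * (pi - qi) for g, pi, qi in zip(grad_F(q), p, q))
    return F(p) - F(q) - inner

# Generator: negative Shannon entropy (an illustrative choice).
def neg_entropy(x):
    return sum(xi * math.log(xi) for xi in x)

def grad_neg_entropy(x):
    return [math.log(xi) + 1.0 for xi in x]

def kl(p, q):
    """Kullback-Leibler divergence (relative entropy)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
# On probability vectors, D_F with F = negative entropy equals KL divergence.
gap = abs(bregman_divergence(neg_entropy, grad_neg_entropy, p, q) - kl(p, q))
```

Other generators in the same family include the squared Euclidean norm, which recovers the squared Euclidean distance.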
Problem

Research questions and friction points this paper is trying to address.

Extend Hausdorff distance to asymmetric Bregman divergences
Compare probabilistic predictions from machine learning models
Survey Bregman geometry algorithms for machine learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extend Hausdorff distance to asymmetric spaces
Apply Bregman-Hausdorff divergence to compare predictions
Develop efficient algorithms for high-dimensional inputs
Tuyen Pham
University of Florida, Gainesville, US
Hana Dal Poz Kourimská
University of Potsdam, Potsdam, Germany
Hubert Wagner
University of Florida
topological data analysis, computational geometry and topology, computational mathematics