Distributed Algorithms for Euclidean Clustering

📅 2026-03-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the Euclidean $(k,z)$-clustering problem in distributed settings and presents the first $(1+\varepsilon)$-strong coreset construction that avoids explicit transmission of raw coordinates. By integrating constant-factor approximations, efficient coreset construction, and compact encoding strategies within both the coordinator and blackboard communication models, the proposed method substantially reduces communication costs. Theoretical analysis shows that the communication complexity in the coordinator model is $\widetilde{O}(sk + dk/\min(\varepsilon^4, \varepsilon^{2+z}) + dk\log(n\Delta))$, while the blackboard model further improves this to $\widetilde{O}(s\log(n\Delta) + dk\log(n\Delta) + dk/\min(\varepsilon^4, \varepsilon^{2+z}))$, nearly matching the theoretical lower bound. Notably, this is the first algorithm to achieve a $(1+\varepsilon)$-approximation in the blackboard model for this problem.
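To make the objective concrete: Euclidean $(k,z)$-clustering assigns each point to its nearest center and sums the $z$-th powers of those distances ($z=1$ recovers $k$-median, $z=2$ recovers $k$-means). A minimal NumPy sketch of this cost function (an illustration only, not the paper's distributed protocol; the names `kz_cost`, `points`, `centers` are ours):

```python
import numpy as np

def kz_cost(points, centers, z):
    """(k,z)-clustering cost: sum over all points of the z-th power of
    the Euclidean distance to the nearest center.
    z=1 gives k-median, z=2 gives k-means."""
    # Pairwise distances, shape (n_points, n_centers).
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    # Each point contributes dist-to-nearest-center raised to the z-th power.
    return float((dists.min(axis=1) ** z).sum())
```

The distributed protocols in the paper approximate exactly this quantity while communicating a compressed summary rather than the raw coordinates.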

📝 Abstract
We study the problem of constructing $(1+\varepsilon)$-coresets for Euclidean $(k,z)$-clustering in the distributed setting, where $n$ data points are partitioned across $s$ sites. We focus on two prominent communication models: the coordinator model and the blackboard model. In the coordinator model, we design a protocol that achieves a $(1+\varepsilon)$-strong coreset with total communication complexity $\tilde{O}\left(sk + \frac{dk}{\min(\varepsilon^4,\varepsilon^{2+z})} + dk\log(n\Delta)\right)$ bits, improving upon prior work (Chen et al., NeurIPS 2016) by eliminating the need to communicate explicit point coordinates in the clear across all servers. In the blackboard model, we further reduce the communication complexity to $\tilde{O}\left(s\log(n\Delta) + dk\log(n\Delta) + \frac{dk}{\min(\varepsilon^4,\varepsilon^{2+z})}\right)$ bits, achieving better bounds than previous approaches while upgrading from constant-factor to $(1+\varepsilon)$-approximation guarantees. Our techniques combine new strategies for constant-factor approximation with efficient coreset constructions and compact encoding schemes, leading to optimal protocols that match both the communication costs of the best-known offline coreset constructions and existing lower bounds (Chen et al., NeurIPS 2016; Huang et al., STOC 2024), up to polylogarithmic factors.
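For intuition on what a weighted coreset is: it is a small weighted point set whose weighted clustering cost approximates the full cost for every candidate center set. A toy sketch using uniform sampling with importance weights $n/m$ (the paper's $(1+\varepsilon)$-strong construction uses far more refined sampling and encoding; the helper names below are ours):

```python
import numpy as np

def uniform_coreset(points, m, seed=None):
    """Toy weighted summary: sample m points uniformly with replacement,
    each weighted n/m so the weighted cost is an unbiased estimate of the
    full cost. Real (1+eps)-strong coresets use sensitivity-based
    sampling to get worst-case guarantees over all center sets."""
    rng = np.random.default_rng(seed)
    n = len(points)
    idx = rng.choice(n, size=m, replace=True)
    weights = np.full(m, n / m)
    return points[idx], weights

def weighted_kz_cost(points, weights, centers, z):
    """Weighted (k,z)-clustering cost of a (points, weights) summary."""
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return float((weights * dists.min(axis=1) ** z).sum())
```

In the distributed protocols, each site ships such a compressed weighted summary (compactly encoded) instead of its raw coordinates, which is where the communication savings come from.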
Problem

Research questions and friction points this paper is trying to address.

distributed algorithms
Euclidean clustering
coresets
communication complexity
(k,z)-clustering
Innovation

Methods, ideas, or system contributions that make the work stand out.

distributed clustering
coreset construction
communication complexity
Euclidean (k,z)-clustering
approximation algorithms