Metric Embeddings Beyond Bi-Lipschitz Distortion via Sherali-Adams

📅 2023-11-29
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses a bottleneck in metric embedding via multidimensional scaling (MDS): existing approximation algorithms suffer from exponential dependence on the aspect ratio $\Delta$ of the input, leading to prohibitive running times. The authors present the first approximation algorithm with *quasi-polynomial* dependence on $\Delta$. In constant-dimensional Euclidean space, it achieves cost $O(\log \Delta) \cdot \mathrm{OPT}^{\Omega(1)} + \varepsilon$ in $n^{O(1)} \cdot 2^{\mathrm{poly}((\log \Delta)/\varepsilon)}$ time, exponentially faster than the prior $2^{\mathrm{poly}(\Delta/\varepsilon)}$-time algorithm. The key methodological innovation is the integration of the Sherali–Adams linear programming hierarchy with a geometry-aware conditional rounding, which bypasses the exponential $\Delta$-dependence inherent in classical approaches. This yields a new theoretical paradigm for embeddings beyond bi-Lipschitz distortion, advancing both algorithmic design and geometric embedding theory.
📝 Abstract
Metric embeddings are a widely used method in algorithm design, where generally a "complex" metric is embedded into a simpler, lower-dimensional one. Historically, the theoretical computer science community has focused on bi-Lipschitz embeddings, which guarantee that every pairwise distance is approximately preserved. In contrast, alternative embedding objectives that are commonly used in practice avoid bi-Lipschitz distortion; yet these approaches have received comparatively less study in theory. In this paper, we focus on Multi-dimensional Scaling (MDS), where we are given a set of non-negative dissimilarities $\{d_{i,j}\}_{i,j\in [n]}$ over $n$ points, and the goal is to find an embedding $\{x_1,\dots,x_n\} \subset \mathbb{R}^k$ that minimizes $$\mathrm{OPT}=\min_{x}\mathbb{E}_{i,j\in [n]}\left(1-\frac{\|x_i - x_j\|}{d_{i,j}}\right)^2.$$ Despite its popularity, our theoretical understanding of MDS is extremely limited. Recently, Demaine et al. (arXiv:2109.11505) gave the first approximation algorithm with provable guarantees for this objective, which achieves an embedding in constant-dimensional Euclidean space with cost $\mathrm{OPT}+\epsilon$ in $n^2\cdot 2^{\mathrm{poly}(\Delta/\epsilon)}$ time, where $\Delta$ is the aspect ratio of the input dissimilarities. For metrics that admit low-cost embeddings, $\Delta$ scales polynomially in $n$. In this work, we give the first approximation algorithm for MDS with quasi-polynomial dependency on $\Delta$: for constant-dimensional Euclidean space, we achieve a solution with cost $O(\log \Delta)\cdot \mathrm{OPT}^{\Omega(1)}+\epsilon$ in time $n^{O(1)} \cdot 2^{\mathrm{poly}((\log \Delta)/\epsilon)}$. Our algorithms are based on a novel geometry-aware analysis of a conditional rounding of the Sherali–Adams LP hierarchy, allowing us to avoid the exponential dependency on the aspect ratio that would typically result from this rounding.
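As a concrete illustration (not taken from the paper), the MDS objective above can be evaluated for a candidate embedding in a few lines of NumPy; `mds_cost` is a hypothetical helper name, and the diagonal terms $i=j$ (where $d_{i,i}$ would be zero) are excluded from the average:

```python
import numpy as np

def mds_cost(X, D):
    """Average squared relative distortion of embedding X w.r.t. dissimilarities D.

    X: (n, k) array of embedded points x_1, ..., x_n.
    D: (n, n) array of positive off-diagonal dissimilarities d_{i,j}.
    Illustrative sketch only: it evaluates the objective for a given X,
    it does not solve the (NP-hard to approximate) minimization problem.
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]       # pairwise difference vectors
    dists = np.linalg.norm(diffs, axis=2)       # Euclidean distances ||x_i - x_j||
    mask = ~np.eye(n, dtype=bool)               # exclude the diagonal i == j
    return np.mean((1.0 - dists[mask] / D[mask]) ** 2)
```

If the embedding realizes every dissimilarity exactly, the cost is 0; any relative stretching or shrinking of a pair contributes its squared relative error to the average.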
Problem

Research questions and friction points this paper is trying to address.

Improving approximation algorithms for Multi-dimensional Scaling (MDS)
Reducing the exponential dependence on the aspect ratio $\Delta$ to quasi-polynomial
Avoiding the exponential $\Delta$-dependence typically incurred when rounding the Sherali–Adams hierarchy
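For reference, the aspect ratio $\Delta$ that drives all of the running-time bounds above is simply the ratio of the largest to the smallest input dissimilarity. A minimal sketch (the helper name `aspect_ratio` is our own, not the paper's):

```python
import numpy as np

def aspect_ratio(D):
    """Aspect ratio Delta = max_{i != j} d_{i,j} / min_{i != j} d_{i,j}.

    D: (n, n) array of non-negative dissimilarities; the diagonal is ignored.
    """
    n = D.shape[0]
    off_diag = D[~np.eye(n, dtype=bool)]        # off-diagonal entries d_{i,j}, i != j
    return off_diag.max() / off_diag.min()
```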
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conditional rounding of the Sherali–Adams LP hierarchy
Geometry-aware analysis that avoids exponential dependence on the aspect ratio
First approximation algorithm for MDS with quasi-polynomial dependence on $\Delta$