Algorithmic Universality, Low-Degree Polynomials, and Max-Cut in Sparse Random Graphs

📅 2024-12-23
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the universality of algorithms for variational problems—such as Max-Cut—on sparse random graphs: whether algorithms exhibit identical asymptotic behavior under different input distributions sharing the same low-order moment conditions. Method: Focusing on low-degree polynomial (LDP) algorithms, the authors establish a rigorous performance equivalence between the sparse Erdős–Rényi model $G(n,d/n)$ and the Sherrington–Kirkpatrick (SK) spin glass model, leveraging tools from random graph theory, spin glass analysis, and discrete approximation error control. Contribution/Results: The paper provides the first proof that, after appropriate rescaling, the Max-Cut values produced by LDP algorithms converge to the same limit in both models, with vanishing error. It further demonstrates that the output of approximate message passing (AMP) on sparse graphs asymptotically concentrates near the binary hypercube—a key step toward bridging continuous AMP analysis with discrete combinatorial optimization. These results lay a new theoretical foundation for moment-based algorithm design in random combinatorial optimization.

📝 Abstract
Universality, namely distributional invariance, is a well-known property of many random structures. For example, it is known to hold for a broad range of variational problems with random input. Much less is known about the universality of the performance of specific algorithms for solving such variational problems. Namely, do algorithms tuned to specific variational tasks produce the same asymptotic answer regardless of the underlying distribution? In this paper we show that the answer is yes for a class of models, which includes spin glass models and constraint satisfaction problems on sparse graphs, provided that an algorithm can be coded as a low-degree polynomial (LDP). We illustrate this specifically for the case of the Max-Cut problem in the sparse Erdős–Rényi graph $\mathbb{G}(n, d/n)$. We use the fact that the Approximate Message Passing (AMP) algorithm, which is an effective algorithm for finding near-ground states of the Sherrington–Kirkpatrick (SK) model, is well approximated by an LDP. We then establish our main universality result: the performance of LDP-based algorithms exhibiting a certain connectivity property is the same in the mean-field (SK) setting and in the random graph $\mathbb{G}(n, d/n)$ setting, up to an appropriate rescaling. The main technical challenge which we address in this paper is showing that the output of the LDP algorithm on $\mathbb{G}(n, d/n)$ is truly discrete, namely that it is close to the set of points in the binary cube.
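The pipeline sketched in the abstract—run an AMP-style iteration on a centered and rescaled sparse adjacency matrix, then round the continuous iterate onto the binary cube—can be illustrated schematically. This is a minimal sketch, not the paper's algorithm: the `tanh` nonlinearity, the Onsager correction, the iteration count, and the function names are illustrative assumptions, and the paper's AMP for near-ground states of the SK model is substantially more involved.

```python
import numpy as np

def amp_sketch(A, iters=20, seed=0):
    # Schematic AMP-style iteration with an Onsager correction term,
    # followed by sign rounding onto the binary cube {-1, +1}^n.
    # Illustrative only; the tanh nonlinearity is an assumption.
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)           # random initialization
    m_prev = np.zeros(n)
    for _ in range(iters):
        m = np.tanh(z)                   # coordinate-wise nonlinearity
        b = np.mean(1.0 - m**2)          # Onsager coefficient
        z = A @ m - b * m_prev
        m_prev = m
    return np.sign(z)                    # round to the hypercube

def cut_value(A_graph, x):
    # Number of edges of the graph cut by the +/-1 assignment x.
    return int(np.sum(np.triu(A_graph, 1) * (np.outer(x, x) < 0)))

# Sparse Erdos-Renyi G(n, d/n); hypothetical parameter choices.
n, d = 300, 5
rng = np.random.default_rng(1)
upper = np.triu(rng.random((n, n)) < d / n, 1)
A_graph = (upper + upper.T).astype(float)
# Center and rescale so the entries have variance ~ 1/n, matching the
# SK normalization; negate so the iteration tracks the bottom of the
# spectrum, the direction associated with large cuts.
A = -(A_graph - d / n) / np.sqrt(d)
x = amp_sketch(A)
print(cut_value(A_graph, x), "edges cut out of", int(upper.sum()))
```

The centering step mirrors the rescaling in the abstract: the entries of the centered, scaled matrix have variance roughly $1/n$, the same normalization as the Gaussian couplings of the SK model, which is what makes a common analysis of the two models plausible.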
Problem

Research questions and friction points this paper is trying to address.

Does algorithmic universality hold for low-degree polynomial methods?
Do algorithms achieve the same performance on mean-field and sparse random graph models?
How can algorithms on sparse random graphs be guaranteed to produce genuinely discrete outputs?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Algorithmic universality for low-degree polynomial algorithms
Approximate Message Passing approximated by low-degree polynomials
Universality of coordinate-wise statistics ensures discrete output