🤖 AI Summary
This work studies the sign balancing problem for sequences of Gaussian symmetric matrices: given $n$ independent $d \times d$ standard Gaussian symmetric matrices, does there exist a $\pm 1$ signing such that the operator norm of their signed sum is bounded by a prescribed threshold? This is the average-case random variant of the Matrix Spencer conjecture, and it is closely related to the symmetric binary perceptron. The authors rigorously characterize the satisfiability phase transition: with margin $\Delta = \kappa\sqrt{n}$ and $n/d^2 \to \tau$, there exist critical thresholds $\tau_1(\kappa)$ (the threshold above which the expected number of solutions is exponentially large) and $\tau_2(\kappa)$ (the high-probability solvability threshold). They prove macroscopic spectral shrinkage, i.e., the operator norm of the balanced sum falls significantly below the semicircle-law prediction, when $n = \Theta(d^2)$. Their analysis combines the second-moment method, large deviation and concentration inequalities for the spectral norm of correlated Gaussian matrices, and Altschuler's recent results on margin concentration in perceptron-type problems. Notably, they show that for some values of $(\tau, \kappa)$ in the regime $\tau_1 < \tau < \tau_2$ the second-moment method fails.
📝 Abstract
Given a sequence of $d \times d$ symmetric matrices $\{\mathbf{W}_i\}_{i=1}^n$ and a margin $\Delta > 0$, we investigate whether it is possible to find signs $(\epsilon_1, \dots, \epsilon_n) \in \{\pm 1\}^n$ such that the operator norm of the signed sum satisfies $\|\sum_{i=1}^n \epsilon_i \mathbf{W}_i\|_{\rm op} \leq \Delta$. Kunisky and Zhang (2023) recently introduced a random version of this problem, where the matrices $\{\mathbf{W}_i\}_{i=1}^n$ are drawn from the Gaussian orthogonal ensemble. This model can be seen as a random variant of the celebrated Matrix Spencer conjecture and as a matrix-valued analog of the symmetric binary perceptron in statistical physics. In this work, we establish a satisfiability transition in this problem as $n, d \to \infty$ with $n/d^2 \to \tau > 0$. First, we prove that the expected number of solutions with margin $\Delta = \kappa\sqrt{n}$ has a sharp threshold at a critical $\tau_1(\kappa)$: for $\tau < \tau_1(\kappa)$ the problem is typically unsatisfiable, while for $\tau > \tau_1(\kappa)$ the average number of solutions is exponentially large. Second, combining a second-moment method with recent results of Altschuler (2023) on margin concentration in perceptron-type problems, we identify a second threshold $\tau_2(\kappa)$ such that for $\tau > \tau_2(\kappa)$ the problem admits solutions with high probability. In particular, we establish that a system of $n = \Theta(d^2)$ Gaussian random matrices can be balanced so that the spectrum of the resulting matrix macroscopically shrinks compared to the semicircle law. Finally, under a technical assumption, we show that there exist values of $(\tau, \kappa)$ for which the number of solutions has large variance, implying the failure of the second moment method. Our proofs rely on establishing concentration and large deviation properties of correlated Gaussian matrices under spectral norm constraints.
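To make the problem statement concrete, here is a small illustrative sketch (not the paper's proof technique, which is non-constructive): it samples a few symmetric Gaussian matrices and brute-forces all $2^n$ signings to find the one minimizing the operator norm of the signed sum. The `goe` sampler below uses one common unnormalized GOE convention (off-diagonal variance 1, diagonal variance 2); the brute-force search is feasible only for toy sizes of $n$.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def goe(d, rng):
    """Sample a d x d symmetric Gaussian matrix (unnormalized GOE convention:
    off-diagonal entries N(0, 1), diagonal entries N(0, 2))."""
    A = rng.standard_normal((d, d))
    return (A + A.T) / np.sqrt(2)

n, d = 10, 4
Ws = [goe(d, rng) for _ in range(n)]

# Exhaustively search all 2^n sign patterns for the smallest operator norm.
best_norm, best_signs = np.inf, None
for signs in itertools.product([-1, 1], repeat=n):
    S = sum(e * W for e, W in zip(signs, Ws))
    norm = np.linalg.norm(S, 2)  # spectral (operator) norm
    if norm < best_norm:
        best_norm, best_signs = norm, signs

# The all-ones signing is one of the 2^n candidates, so the optimum is
# never worse than the unsigned sum; typically it is much smaller.
print("best signed norm:", best_norm)
print("unsigned sum norm:", np.linalg.norm(sum(Ws), 2))
```

The regime studied in the paper ($n = \Theta(d^2)$, margin $\Delta = \kappa\sqrt{n}$) is far beyond such enumeration; the point of this sketch is only to show what a "solution" to the signing problem is.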