🤖 AI Summary
This work addresses generalized iterative algorithms for high-dimensional nonconvex optimization whose iterations mix first-order and saddle-point updates. We propose the first rigorous State Evolution (SE) analysis applicable to non-coordinate-separable update structures. Methodologically, we introduce a Hilbert-space lifting technique that establishes existence and uniqueness of a convenient parameterization of the SE, and we combine Bolthausen’s conditioning method with a sequential variant of Gordon’s Gaussian comparison inequality—thereby moving beyond classical SE analyses restricted to purely first-order or separable settings. Theoretically, we establish the SE prediction under nonseparable conditions and derive an explicit finite-sample bound on the deviation of the empirical iterates from the deterministic SE trajectory. This framework provides unified, verifiable, and precise performance guarantees for a broad class of complex optimizers, including first-order methods interleaved with saddle-point updates.
📝 Abstract
We develop a toolbox for exact analysis of iterative algorithms on a class of high-dimensional nonconvex optimization problems with random data. While prior work has shown that low-dimensional statistics of (generalized) first-order methods can be predicted by a deterministic recursion known as state evolution, our focus is on developing such a prediction for a more general class of algorithms. We provide a state evolution for any method whose iterations are given by (possibly interleaved) first-order and saddle point updates, showing two main results. First, we establish a rigorous state evolution prediction that holds even when the updates are not coordinate-wise separable. Second, we establish finite-sample guarantees bounding the deviation of the empirical updates from the established state evolution. In the process, we develop a technical toolkit that may prove useful in related problems. One component of this toolkit is a general Hilbert space lifting technique to prove existence and uniqueness of a convenient parameterization of the state evolution. Another component of the toolkit combines a generic application of Bolthausen's conditioning method with a sequential variant of Gordon's Gaussian comparison inequality, and provides additional ingredients that enable a general finite-sample analysis.
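To make the notion of a state-evolution prediction concrete, here is a toy sketch (not the paper's algorithm or setting): we iterate h ↦ Aᵀf(h) with a *fresh* Gaussian matrix at every step, which makes each iterate exactly Gaussian and sidesteps the Onsager/memory corrections that the paper's analysis actually handles. The nonlinearity `f = tanh`, the dimension `n`, and the recursion τ²_{t+1} = E[f(τ_t Z)²] are illustrative assumptions; the point is only that a low-dimensional statistic of the random iterates tracks a deterministic recursion.

```python
import numpy as np

# Toy state-evolution (SE) illustration -- a hedged sketch, NOT the paper's
# method. With a fresh i.i.d. Gaussian matrix A_t at each step, the iterate
# h^{t+1} = A_t^T f(h^t), with A_t entries ~ N(0, 1/n), is exactly Gaussian
# given h^t, with per-coordinate variance (1/n)||f(h^t)||^2. Hence the
# empirical variance should track the deterministic recursion
#   tau_{t+1}^2 = E[ f(tau_t Z)^2 ],   Z ~ N(0, 1).

rng = np.random.default_rng(0)
n, T = 4000, 5
f = np.tanh  # any Lipschitz nonlinearity (illustrative choice)

# --- empirical iteration on random data ---
h = rng.standard_normal(n)  # tau_0 = 1
emp = []
for t in range(T):
    A = rng.standard_normal((n, n)) / np.sqrt(n)  # fresh randomness each step
    h = A.T @ f(h)
    emp.append(np.mean(h ** 2))  # empirical per-coordinate variance

# --- deterministic SE recursion (expectation via Monte Carlo) ---
Z = rng.standard_normal(200_000)
tau2, se = 1.0, []
for t in range(T):
    tau2 = np.mean(f(np.sqrt(tau2) * Z) ** 2)  # tau_{t+1}^2 = E f(tau_t Z)^2
    se.append(tau2)

for t in range(T):
    print(f"t={t + 1}: empirical {emp[t]:.4f}  vs  SE {se[t]:.4f}")
```

With n = 4000 the empirical variances match the SE trajectory to a few percent; the paper's finite-sample results quantify deviations of this kind, but for the far richer class of interleaved first-order and saddle-point updates with shared (non-fresh) randomness.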