🤖 AI Summary
This work characterizes the geometric structure of the computationally hard phase in statistical detection problems, establishing a rigorous equivalence between the optimized Franz–Parisi (FP) criterion and Statistical Query (SQ) lower bounds. Methodologically, it extends the FP criterion, previously limited to Gaussian settings, to non-Gaussian models including non-Gaussian component analysis (NGCA), single-index models, convex truncation detection, and mixtures of sparse linear regressions. The extension combines potential-function analysis from statistical physics, low-degree polynomial methods, and Gaussian correlation inequalities to analyze the overlap structure of these models systematically. The contributions are threefold: (1) the first proof of an exact equivalence between the optimized FP criterion and SQ lower bounds across a broad class of models; (2) a unifying, simplified framework for deriving classical SQ lower bounds; and (3) the first tight SQ lower bounds for several long-standing non-Gaussian inference problems, revealing the geometric nature of computational hardness and yielding a universal hardness criterion.
📝 Abstract
Bandeira et al. (2022) introduced the Franz-Parisi (FP) criterion for characterizing the computationally hard phases in statistical detection problems. The FP criterion, based on an annealed version of the celebrated Franz-Parisi potential from statistical physics, was shown to be equivalent to low-degree polynomial (LDP) lower bounds for Gaussian additive models, thereby connecting two distinct approaches to understanding computational hardness in statistical inference. In this paper, we propose a refined FP criterion that aims to better capture the geometric "overlap" structure of statistical models. Our main result establishes that this optimized FP criterion is equivalent to Statistical Query (SQ) lower bounds -- another foundational framework in the computational complexity of statistical inference. Crucially, this equivalence holds under a mild, verifiable assumption satisfied by a broad class of statistical models, including Gaussian additive models and planted sparse models, as well as non-Gaussian component analysis (NGCA), single-index (SI) models, and convex truncation detection settings. For instance, in the case of convex truncation tasks, the assumption is equivalent to the Gaussian correlation inequality (Royen, 2014) from convex geometry. Our equivalence not only unifies and simplifies the derivation of several known SQ lower bounds -- such as for the NGCA model (Diakonikolas et al., 2017) and the SI model (Damian et al., 2024) -- but also yields new SQ lower bounds of independent interest, including for the computational gaps in mixed sparse linear regression (Arpino et al., 2023) and convex truncation (De et al., 2023).