🤖 AI Summary
This paper addresses the statistical and computational challenges of estimating a high-dimensional parameter $\mathbf{x}$ under conic constraints from nonlinear observations $y_i = f_i(\langle \mathbf{a}_i, \mathbf{x}\rangle)$, encompassing tensor single-index models, tensor logistic regression, (local) noisy tensor phase retrieval, and one-bit tensor sensing. The authors propose a unified analytical framework based on a restricted approximate invertibility condition (RAIC), derived from the gradient of the observation model, and integrate three algorithmic paradigms: projected gradient descent, Riemannian gradient descent, and factored gradient descent. To their knowledge, this is the first work to construct minimax-optimal and computationally efficient estimators for such tensorized nonlinear models. Under Gaussian designs, the algorithms enjoy linear convergence guarantees, the resulting estimation rates are statistically optimal, and simulations as well as a real-data experiment corroborate the theory.
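To make the projected-gradient paradigm concrete, here is a minimal toy sketch, not the paper's algorithm: the cone is the set of $s$-sparse vectors, the projection is hard thresholding, and the gradient is the plain least-squares gradient. The paper's framework would instead plug in a model-specific gradient built from the nonlinear observations and verify the RAIC for it; all function names and parameters below are our own illustrative choices.

```python
import numpy as np

def hard_threshold(v, s):
    """Project onto the cone of s-sparse vectors (keep the s largest magnitudes)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def projected_gd(A, y, s, step=0.5, iters=300):
    """Projected gradient descent for y ~ A @ x with an s-sparse x.

    Uses the plain least-squares gradient as a stand-in; under the paper's
    RAIC, each (negative) gradient step approximately points toward the
    target, so the projected iteration contracts linearly.
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - y) / m  # empirical gradient
        x = hard_threshold(x - step * grad, s)  # descent step + cone projection
    return x
```

With a Gaussian design and enough measurements, the iteration recovers a sparse signal to machine precision, which mirrors (in the simplest linear case) the linear-convergence guarantees described above.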
📝 Abstract
We consider the estimation of a parameter $\mathbf{x}$ living in a cone from nonlinear observations of the form $\{y_i=f_i(\langle\mathbf{a}_i,\mathbf{x}\rangle)\}_{i=1}^m$. We develop a unified approach that first constructs a gradient from the data and then establishes the restricted approximate invertibility condition (RAIC), a condition that quantifies how well the gradient aligns with the ideal descent step. We show that the RAIC yields linear convergence guarantees for the standard projected gradient descent algorithm, a Riemannian gradient descent algorithm for low-Tucker-rank tensor estimation, and a factorized gradient descent algorithm for asymmetric low-rank matrix estimation. Under Gaussian designs, we establish sharp RAICs for the canonical statistical estimation problems of single-index models, generalized linear models, noisy phase retrieval, and one-bit compressed sensing. Combining the convergence guarantees with the RAIC, we obtain a set of optimal statistical estimation results, including, to our knowledge, the first minimax-optimal and computationally efficient algorithms for tensor single-index models, tensor logistic regression, (local) noisy tensor phase retrieval, and one-bit tensor sensing. Moreover, several other results are new or match the best known guarantees. We also provide simulations and a real-data experiment to illustrate the theoretical results.
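For the factorized-gradient paradigm mentioned above, a minimal sketch for asymmetric low-rank matrix estimation is shown below. This is illustrative least-squares matrix sensing with a standard spectral initialization, not the paper's estimator: the parametrization $\mathbf{X} = \mathbf{U}\mathbf{V}^\top$ is updated by gradient steps on the two factors. Names and step sizes are our own assumptions.

```python
import numpy as np

def factored_gd(As, y, r, step=0.4, iters=800):
    """Factored gradient descent for matrix sensing y_i ~ <A_i, U V^T>.

    Spectral initialization: top-r factors of (1/m) * sum_i y_i A_i,
    which is approximately the target matrix under a Gaussian design.
    Each iteration takes simultaneous gradient steps on U and V.
    """
    m = len(y)
    X0 = sum(yi * A for yi, A in zip(y, As)) / m
    Uf, S, Vt = np.linalg.svd(X0, full_matrices=False)
    U = Uf[:, :r] * np.sqrt(S[:r])   # balanced factor init, n1 x r
    V = Vt[:r].T * np.sqrt(S[:r])    # balanced factor init, n2 x r
    for _ in range(iters):
        resid = np.array([np.sum(A * (U @ V.T)) for A in As]) - y
        G = sum(res * A for res, A in zip(resid, As)) / m  # gradient in X
        U, V = U - step * (G @ V), V - step * (G.T @ U)
    return U @ V.T
```

The balanced spectral start keeps $\mathbf{U}^\top\mathbf{U} \approx \mathbf{V}^\top\mathbf{V}$, which is why no explicit balancing regularizer is needed in this toy setting.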