🤖 AI Summary
Structural model estimation via the generalized method of moments (GMM) or simulated method of moments (SMM) often involves non-convex objective functions, which undermines the usual global convergence guarantees of numerical optimizers. This paper shows that, for smooth moment models, convexity is not required: under a global rank condition on the Jacobian of the sample moments, gradient-descent and Gauss–Newton algorithms with suitable tuning parameters are globally convergent. The guarantees are robust to non-convexity, one-to-one nonlinear reparameterizations, and moderate model misspecification, whereas Newton–Raphson and quasi-Newton methods can fail to converge on the same problems. A simple non-convex GMM example satisfying the rank condition illustrates the theory, and numerical applications to random-coefficient demand estimation and impulse response matching demonstrate its practical relevance for estimating complex economic models.
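To make the idea concrete, here is a minimal toy sketch (our own construction, not one of the paper's examples): a one-parameter, two-moment problem whose GMM objective is non-convex, yet whose moment Jacobian has full column rank everywhere, so plain gradient descent with a small enough step size still finds the solution.

```python
import numpy as np

# Hypothetical moment model (illustration only): g(theta) = (theta - 1, theta^3 - 1).
# The objective Q(theta) = ||g(theta)||^2 is non-convex in theta, but the 2x1
# Jacobian J(theta) = (1, 3*theta^2)' has full column rank for every theta,
# so Q has no spurious stationary points and gradient descent converges.

def g(theta):
    return np.array([theta - 1.0, theta**3 - 1.0])

def jacobian(theta):
    return np.array([[1.0], [3.0 * theta**2]])

def gradient_descent(theta0, step=0.01, tol=1e-12, max_iter=5000):
    theta = theta0
    for _ in range(max_iter):
        grad = 2.0 * (jacobian(theta).T @ g(theta))[0]  # gradient of ||g||^2
        theta -= step * grad
        if abs(grad) < tol:
            break
    return theta

print(gradient_descent(-2.0))  # ≈ 1.0 despite starting far from the solution
```

The fixed step size here is tuned to this toy problem; as the summary notes, the tuning parameters matter for the convergence guarantee.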
📝 Abstract
The generalized and simulated methods of moments (GMM, SMM) are often used to estimate structural economic models. Yet it is commonly reported that optimization is challenging because the corresponding objective function is non-convex. For smooth problems, this paper shows that convexity is not required: under a global rank condition involving the Jacobian of the sample moments, certain algorithms are globally convergent. These include a gradient-descent and a Gauss–Newton algorithm with an appropriate choice of tuning parameters. The results are robust to 1) non-convexity, 2) one-to-one nonlinear reparameterizations, and 3) moderate misspecification. In contrast, Newton–Raphson and quasi-Newton methods can fail to converge on the same estimation problems because of non-convexity. A simple example illustrates a non-convex GMM estimation problem that satisfies the aforementioned rank condition. Empirical applications to random-coefficient demand estimation and impulse response matching further illustrate the results.
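The contrast between Gauss–Newton and Newton–Raphson can be sketched on a toy moment model (our own illustration, not from the paper): with moments g(θ) = (θ − 1, θ³ − 1), the objective Q(θ) = ‖g(θ)‖² has an indefinite second derivative, so a Newton–Raphson step can move away from the solution, while damped Gauss–Newton, which uses only the full-rank Jacobian, converges.

```python
import numpy as np

# Illustrative moment model (not the paper's example): g(theta) = (theta-1, theta^3-1).
# Gauss-Newton drops the second-order term 12*theta*(theta^3 - 1) that makes
# Q''(theta) negative in places (e.g. Q''(0.5) = -2.125), so full rank of the
# Jacobian J(theta) = (1, 3*theta^2)' suffices for the damped iteration.

def g(theta):
    return np.array([theta - 1.0, theta**3 - 1.0])

def jacobian(theta):
    return np.array([[1.0], [3.0 * theta**2]])

def gauss_newton(theta0, step=0.5, tol=1e-10, max_iter=500):
    theta = theta0
    for _ in range(max_iter):
        J, r = jacobian(theta), g(theta)
        d = np.linalg.solve(J.T @ J, -J.T @ r)[0]  # solve (J'J) d = -J'r
        theta += step * d
        if abs(d) < tol:
            break
    return theta

def newton_step(theta):
    # One Newton-Raphson step on Q(theta) = (theta-1)^2 + (theta^3-1)^2.
    q1 = 2.0 * (theta - 1.0) + 6.0 * theta**2 * (theta**3 - 1.0)
    q2 = 2.0 + 12.0 * theta * (theta**3 - 1.0) + 18.0 * theta**4
    return theta - q1 / q2

print(gauss_newton(-2.0))  # ≈ 1.0 from a distant start
print(newton_step(0.5))    # negative: the Newton step moves away from theta* = 1
```

From θ = 0.5, where Q''(0.5) < 0, the Newton step lands at about −0.59, farther from the solution than the starting point, which mirrors the abstract's point that Newton-type methods can fail precisely because of non-convexity.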