AI Summary
This work addresses the challenge of balancing computational efficiency and estimation accuracy in high-dimensional covariance and precision matrix estimation by proposing a learnable reparameterized linearized alternating direction method of multipliers (LADMM) framework. The approach integrates neural networks into the optimization pipeline to data-adaptively model proximal operators and leverages reparameterization techniques to accelerate convergence. Theoretical analysis establishes rigorous guarantees on the algorithm's convergence, monotonicity, and convergence rate. Empirical evaluations demonstrate that the proposed method significantly outperforms classical optimization algorithms across various high-dimensional structural settings, achieving both higher estimation accuracy and faster convergence.
Abstract
Efficient estimation of high-dimensional matrices, including covariance and precision matrices, is a cornerstone of modern multivariate statistics. Most existing studies have focused primarily on the theoretical properties of the estimators (e.g., consistency and sparsity), while largely overlooking the computational challenges inherent in high-dimensional settings. Motivated by recent advances in learning-based optimization methods, which integrate data-driven structures with classical optimization algorithms, we explore high-dimensional matrix estimation assisted by machine learning. Specifically, for the optimization problem of high-dimensional matrix estimation, we first present a solution procedure based on the Linearized Alternating Direction Method of Multipliers (LADMM). We then introduce learnable parameters and model the proximal operators in the iterative scheme with neural networks, thereby improving estimation accuracy and accelerating convergence. Theoretically, we first prove the convergence of LADMM, and then establish the convergence, convergence rate, and monotonicity of its reparameterized counterpart; importantly, we show that the reparameterized LADMM enjoys a faster convergence rate. Notably, the proposed reparameterization theory and methodology are applicable to the estimation of both high-dimensional covariance and precision matrices. We validate the effectiveness of our method by comparing it with several classical optimization algorithms across different structures and dimensions of high-dimensional matrices.
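To make the iterative scheme concrete, below is a minimal numpy sketch of an ADMM-style loop for sparse precision matrix estimation under a graphical-lasso objective (tr(S·Ω) − log det Ω + λ‖Ω‖₁). This is an illustration only: it uses the plain ADMM Ω-update (via eigendecomposition) rather than the paper's linearized variant, and the `thetas` argument stands in for the learned per-iteration proximal thresholds that the reparameterized method would train end-to-end; all names and default values here are assumptions, not the paper's.

```python
import numpy as np

def soft_threshold(X, tau):
    # Proximal operator of the l1 norm (entrywise soft-thresholding).
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def admm_precision(S, lam=0.1, rho=1.0, n_iter=50, thetas=None):
    """Sketch of an ADMM loop for
        minimize  tr(S @ Omega) - log det(Omega) + lam * ||Omega||_1.
    If `thetas` is given, it supplies per-iteration thresholds,
    mimicking the learned proximal steps of the reparameterized
    scheme; otherwise the classical fixed threshold lam/rho is used.
    """
    p = S.shape[0]
    Omega = np.eye(p)        # primal variable
    Z = np.eye(p)            # auxiliary (sparse) copy
    U = np.zeros((p, p))     # scaled dual variable
    for k in range(n_iter):
        # Omega-update: solve rho*Omega - Omega^{-1} = rho*(Z - U) - S
        # in closed form via the eigendecomposition of the right side.
        w, V = np.linalg.eigh(rho * (Z - U) - S)
        w_new = (w + np.sqrt(w**2 + 4.0 * rho)) / (2.0 * rho)
        Omega = (V * w_new) @ V.T
        # Z-update: proximal step with a fixed or learned threshold.
        tau = thetas[k] if thetas is not None else lam / rho
        Z = soft_threshold(Omega + U, tau)
        # Dual ascent on the scaled multiplier.
        U = U + Omega - Z
    return Z

# Toy usage: estimate a 5x5 precision matrix from simulated data.
rng = np.random.RandomState(0)
S = np.cov(rng.randn(200, 5), rowvar=False)
Omega_hat = admm_precision(S, lam=0.2)
```

In the learned variant described above, `thetas` (and, more generally, the whole proximal map) would be parameterized by a neural network and fit on data, while the Ω-update keeps its closed form.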