🤖 AI Summary
This paper addresses the efficient fitting of multilevel factor models whose covariance matrices exhibit a multilevel low-rank (MLR) structure. We propose the first EM algorithm specifically designed for MLR-structured covariances. Theoretically, we establish, for the first time, that the inverse of an invertible positive semidefinite MLR matrix retains the same sparse MLR structure, and we derive an analytical inverse via a recursive application of the Sherman-Morrison-Woodbury identity. Algorithmically, we devise linear-time Cholesky decomposition and Schur complement update procedures tailored to the MLR structure. The resulting EM algorithm achieves *O*(*n*) time and space complexity per iteration, supports arbitrary numbers of hierarchical levels, and preserves numerical stability. An open-source software package implementing the method is publicly available. Experiments demonstrate substantial improvements in scalability and practicality for high-dimensional multilevel factor modeling.
📝 Abstract
We examine a special case of the multilevel factor model, with covariance given by a multilevel low-rank (MLR) matrix (Parshakova et al., 2023). We develop a novel, fast implementation of the expectation-maximization (EM) algorithm, tailored for multilevel factor models, to maximize the likelihood of the observed data. This method accommodates any hierarchical structure and maintains linear time and storage complexities per iteration. This is achieved through a new efficient technique for computing the inverse of a positive semidefinite (PSD) MLR matrix. We show that the inverse of an invertible PSD MLR matrix is also an MLR matrix with the same sparsity in factors, and we use the recursive Sherman-Morrison-Woodbury matrix identity to obtain the factors of the inverse. Additionally, we present an algorithm that computes the Cholesky factorization of an expanded matrix with linear time and space complexities, yielding the covariance matrix as its Schur complement. This paper is accompanied by an open-source package that implements the proposed methods.