MM Algorithms for Variance Components Models

Authors: Hua Zhou, Liuyi Hu, Jin Zhou, Kenneth Lange

Affiliations: 1. Department of Biostatistics, University of California, Los Angeles, CA; 2. Department of Statistics, North Carolina State University, Raleigh, NC; 3. Division of Epidemiology and Biostatistics, University of Arizona, Tucson, AZ; 4. Department of Human Genetics, University of California, Los Angeles, CA

Abstract: Variance components estimation and mixed model analysis are central themes in statistics with applications in numerous scientific disciplines. Despite the best efforts of generations of statisticians and numerical analysts, maximum likelihood estimation (MLE) and restricted MLE of variance component models remain numerically challenging. Building on the minorization–maximization (MM) principle, this article presents a novel iterative algorithm for variance components estimation. Our MM algorithm is trivial to implement and competitive on large data problems. The algorithm readily extends to more complicated problems such as linear mixed models, multivariate response models possibly with missing data, maximum a posteriori estimation, and penalized estimation. We establish the global convergence of the MM algorithm to a Karush–Kuhn–Tucker point and demonstrate, both numerically and theoretically, that it converges faster than the classical EM algorithm when the number of variance components is greater than two and all covariance matrices are positive definite. Supplementary materials for this article are available online.

Keywords: Global convergence; Linear mixed model (LMM); Matrix convexity; Maximum a posteriori (MAP) estimation; Minorization–maximization (MM); Multivariate response; Penalized estimation; Variance components model
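The abstract describes an MM iteration for variance components estimation. As a rough illustration only (not necessarily the exact algorithm of this article), the sketch below implements a well-known MM-style multiplicative update for the model y ~ N(Xβ, Σᵢ σᵢ² Vᵢ), in which each σᵢ² is rescaled by the square root of a ratio of quadratic form to trace; the function name and its arguments are hypothetical.

```python
import numpy as np

def mm_variance_components(y, X, Vs, sigma2=None, n_iter=100, tol=1e-8):
    """Illustrative MM-type iteration for MLE in y ~ N(X beta, sum_i sigma2_i V_i).

    Vs is a list of known positive semidefinite covariance structure matrices.
    This is a sketch of one common MM update, not a production implementation.
    """
    m = len(Vs)
    sigma2 = np.ones(m) if sigma2 is None else np.asarray(sigma2, dtype=float)
    for _ in range(n_iter):
        # Assemble the current marginal covariance Omega and its inverse.
        Omega = sum(s * V for s, V in zip(sigma2, Vs))
        Oinv = np.linalg.inv(Omega)
        # Generalized least squares update for the fixed effects given Omega.
        beta = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)
        r = y - X @ beta
        w = Oinv @ r
        # Multiplicative MM update: scale each sigma2_i by
        # sqrt( r' Oinv V_i Oinv r / tr(Oinv V_i) ).
        quad = np.array([w @ V @ w for V in Vs])
        trace = np.array([np.trace(Oinv @ V) for V in Vs])
        new_sigma2 = sigma2 * np.sqrt(quad / trace)
        converged = np.max(np.abs(new_sigma2 - sigma2)) < tol
        sigma2 = new_sigma2
        if converged:
            break
    return sigma2, beta
```

Each update keeps the variance components nonnegative by construction, which is one practical attraction of multiplicative MM schemes over unconstrained Newton-type steps.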