A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
Authors: Sangwoon Yun, Paul Tseng, Kim-Chuan Toh
Affiliations: 1. Korea Institute for Advanced Study, Seoul, Korea; 2. Department of Mathematics, University of Washington, Seattle, USA; 3. Department of Mathematics, National University of Singapore, Singapore, Singapore
Abstract: We consider a class of unconstrained nonsmooth convex optimization problems, in which the objective function is the sum of a convex smooth function on an open subset of matrices and a separable convex function on a set of matrices. This class includes the covariance selection problem, which can be expressed as an ℓ1-penalized maximum likelihood estimation problem. In this paper, we propose a block coordinate gradient descent method (abbreviated as BCGD) for solving this class of nonsmooth separable problems, with the coordinate block chosen by a Gauss-Seidel rule. The method is simple, highly parallelizable, and suited for large-scale problems. We establish global convergence and, under a local Lipschitzian error bound assumption, a linear rate of convergence for this method. For the covariance selection problem, the method terminates in O(n³/ε) iterations with an ε-optimal solution. We compare the performance of the BCGD method with the first-order methods proposed by Lu (SIAM J Optim 19:1807–1827, 2009; SIAM J Matrix Anal Appl 31:2000–2016, 2010) for solving the covariance selection problem on randomly generated instances. Our numerical experience suggests that the BCGD method can be efficient for large-scale covariance selection problems with constraints.
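To illustrate the flavor of a coordinate gradient descent method with a Gauss-Seidel rule on an ℓ1-regularized smooth objective, here is a minimal sketch on a toy least-squares problem, min_x 0.5·||Ax − b||² + λ·||x||₁, with scalar coordinate blocks. This is an illustrative assumption-laden toy, not the authors' BCGD implementation (which operates on matrix variables and the log-likelihood objective of covariance selection); the function names `soft_threshold` and `bcgd_l1` are invented for this sketch.

```python
# Illustrative sketch only: cyclic (Gauss-Seidel) coordinate gradient descent
# with an l1 soft-threshold prox step, on 0.5*||Ax - b||^2 + lam*||x||_1.
# Not the paper's BCGD method for matrix-valued covariance selection.

def soft_threshold(z, t):
    """Prox operator of t*|.|: shrink z toward zero by t."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def bcgd_l1(A, b, lam, sweeps=200):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    # Maintain the residual r = A x - b so each coordinate update is cheap.
    r = [-b[i] for i in range(m)]
    for _ in range(sweeps):
        for j in range(n):  # Gauss-Seidel: cycle through coordinates in order
            Lj = sum(A[i][j] ** 2 for i in range(m))  # coordinate Lipschitz constant
            if Lj == 0.0:
                continue
            g = sum(A[i][j] * r[i] for i in range(m))  # partial gradient of smooth part
            x_new = soft_threshold(x[j] - g / Lj, lam / Lj)
            if x_new != x[j]:
                d = x_new - x[j]
                for i in range(m):
                    r[i] += A[i][j] * d  # keep residual consistent with updated x
                x[j] = x_new
    return x
```

For a diagonal A the problem separates and the answer is the soft-thresholded least-squares solution, e.g. `bcgd_l1([[1.0, 0.0], [0.0, 1.0]], [3.0, 0.5], 1.0)` returns approximately `[2.0, 0.0]`.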
This article is indexed by SpringerLink and other databases.
|