Generalized symmetric ADMM for separable convex optimization
Authors: Jianchao Bai, Jicheng Li, Fengmin Xu, Hongchao Zhang
Affiliation: 1. School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, People's Republic of China; 2. School of Economics and Finance, Xi'an Jiaotong University, Xi'an, People's Republic of China; 3. Department of Mathematics, Louisiana State University, Baton Rouge, USA
Abstract: The alternating direction method of multipliers (ADMM) has proved effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming. GS-ADMM partitions the variables into two groups, one consisting of $p$ block variables and the other of $q$ block variables, where $p \ge 1$ and $q \ge 1$ are integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method attractive in big data settings. By adding proper proximal terms to the subproblems, we specify a domain of the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case $\mathcal{O}(1/t)$ ergodic convergence rate. This convergence domain of the stepsizes turns out to be significantly larger than other convergence domains in the literature, so GS-ADMM is more flexible and allows larger stepsizes for the dual variable. In addition, two special cases of GS-ADMM, which allow zero proximal terms, are also discussed and analyzed. Preliminary numerical experiments on a sparse matrix minimization problem from statistical learning show that, compared with several state-of-the-art methods, the proposed method is effective and promising.
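To make the update pattern described in the abstract concrete, the following is a minimal sketch of the grouped Gauss–Seidel/Jacobi structure with two dual updates, on a toy quadratic instance with $p = q = 2$ blocks. The quadratic objectives, the proximal coefficient `rho`, and the stepsize values `tau` and `sigma` are illustrative assumptions only; the paper's exact proximal terms and admissible stepsize domain are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance: p = 2 x-blocks, q = 2 y-blocks, quadratic objectives
#   f_i(x_i) = 0.5*||x_i - c_i||^2,  g_j(y_j) = 0.5*||y_j - d_j||^2,
# subject to  A_1 x_1 + A_2 x_2 + B_1 y_1 + B_2 y_2 = b.
# Augmented Lagrangian convention used below:
#   L = f + g - lam^T (residual) + (beta/2)*||residual||^2.
m, n = 8, 5
A = [rng.standard_normal((m, n)) for _ in range(2)]
B = [rng.standard_normal((m, n)) for _ in range(2)]
c = [rng.standard_normal(n) for _ in range(2)]
d = [rng.standard_normal(n) for _ in range(2)]
b = rng.standard_normal(m)

beta, tau, sigma, rho = 1.0, 0.3, 0.3, 1.0   # illustrative values only
x = [np.zeros(n), np.zeros(n)]
y = [np.zeros(n), np.zeros(n)]
lam = np.zeros(m)

def residual(x, y):
    return sum(Ai @ xi for Ai, xi in zip(A, x)) + \
           sum(Bj @ yj for Bj, yj in zip(B, y)) - b

for k in range(500):
    # Jacobi update of the x-group: every block uses the old iterates,
    # with a simple proximal term (rho/2)*||x_i - x_i^k||^2.
    x_new = []
    for i in range(2):
        r = residual(x, y) - A[i] @ x[i]      # constraint part excluding block i
        H = (1 + rho) * np.eye(n) + beta * A[i].T @ A[i]
        rhs = c[i] + A[i].T @ lam - beta * A[i].T @ r + rho * x[i]
        x_new.append(np.linalg.solve(H, rhs))
    x = x_new

    # First dual update, with stepsize tau*beta.
    lam = lam - tau * beta * residual(x, y)

    # Jacobi update of the y-group, using the new x and the new multiplier
    # (this is the Gauss-Seidel part between the two groups).
    y_new = []
    for j in range(2):
        r = residual(x, y) - B[j] @ y[j]
        H = (1 + rho) * np.eye(n) + beta * B[j].T @ B[j]
        rhs = d[j] + B[j].T @ lam - beta * B[j].T @ r + rho * y[j]
        y_new.append(np.linalg.solve(H, rhs))
    y = y_new

    # Second dual update, with stepsize sigma*beta.
    lam = lam - sigma * beta * residual(x, y)

print("final constraint violation:", np.linalg.norm(residual(x, y)))
```

Because the blocks inside each group only read the previous iterate, the two x-solves (and likewise the two y-solves) are independent and could run in parallel, which is the property the abstract points to for big data settings.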