Similar Literature
20 similar documents found.
1.
In this article, we present a fast and stable algorithm for solving a class of optimization problems that arise in many statistical estimation procedures, such as sparse fused lasso over a graph, convex clustering, and trend filtering, among others. We propose a so-called augmented alternating direction method of multipliers (ADMM) algorithm to solve this class of problems. Compared to a standard ADMM algorithm, our proposal significantly reduces the computational cost at each iteration while maintaining roughly the same overall convergence speed. We also consider a new varying penalty scheme for the ADMM algorithm, which could further accelerate the convergence, especially when solving a sequence of problems with tuning parameters of different scales. Extensive numerical experiments on the sparse fused lasso problem show that the proposed algorithm is more efficient than the standard ADMM and two other existing state-of-the-art specialized algorithms. Finally, we discuss a possible extension and some interesting connections to two well-known algorithms. Supplementary materials for the article are available online.
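To make the problem class concrete, a plain ADMM for the generalized-lasso form of such problems, \(\min_\beta \tfrac12\|y-\beta\|^2+\lambda\|D\beta\|_1\) (which covers sparse fused lasso and trend filtering for suitable penalty matrices \(D\)), can be sketched as below. This is only the standard baseline that the paper improves on, not the proposed augmented ADMM; the function names and default parameters are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding operator (prox of t*||.||_1)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_generalized_lasso(y, D, lam, rho=1.0, n_iter=500):
    """Standard ADMM sketch for min_beta 0.5*||y - beta||^2 + lam*||D @ beta||_1,
    using the split z = D @ beta (a baseline, not the paper's augmented ADMM)."""
    p, n = D.shape
    beta, z, u = np.zeros(n), np.zeros(p), np.zeros(p)   # u is the scaled dual variable
    M_inv = np.linalg.inv(np.eye(n) + rho * D.T @ D)     # factor once, reuse every iteration
    for _ in range(n_iter):
        beta = M_inv @ (y + rho * D.T @ (z - u))         # quadratic subproblem
        z = soft_threshold(D @ beta + u, lam / rho)      # prox of the l1 term
        u = u + D @ beta - z                             # dual ascent step
    return beta
```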

2.
《Optimization》2012,61(10):1729-1743
In this note, we consider three types of problems: the H-weighted nearest correlation matrix problem and two types of important doubly non-negative semidefinite programs, derived from binary integer quadratic programming and the maximum cut problem. The dual of each of these three problems is a 3-block separable convex optimization problem with a coupling linear equality constraint. It is known that the directly extended 3-block alternating direction method of multipliers (ADMM3d) is more efficient than many of its variants for solving these convex optimization problems, but its convergence is not guaranteed. By choosing the initial points properly, we obtain the convergence of ADMM3d for solving the duals of these three types of problems. Furthermore, under these initial conditions, we simplify the iterative scheme of ADMM3d and show the equivalence of ADMM3d to the 2-block semi-proximal ADMM for solving the dual's reformulation.

3.
This note serves two purposes. Firstly, we construct a counterexample to show that the statement on the convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex optimization problems in a highly influential paper by Boyd et al. (Found Trends Mach Learn 3(1):1–122, 2011) can be false if no prior condition on the existence of solutions to all the subproblems involved is assumed to hold. Secondly, we present fairly mild conditions to guarantee the existence of solutions to all the subproblems of the ADMM and provide a rigorous convergence analysis of the ADMM with a computationally more attractive large step-length that can even exceed the practically much preferred golden ratio of \((1+\sqrt{5})/2\).
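For reference, the two-block ADMM with a dual step-length \(\tau\), applied to \(\min\{f(x)+g(y): Ax+By=b\}\) with penalty \(\beta>0\), reads as follows; the classical convergent range of the step-length is \(\tau\in(0,(1+\sqrt{5})/2)\), and the paper above shows that this bound can be exceeded under its conditions (a standard textbook form, not copied from the cited paper):
\[
\begin{aligned}
x^{k+1} &\in \arg\min_{x}\; f(x) + \tfrac{\beta}{2}\bigl\|Ax + By^{k} - b + \lambda^{k}/\beta\bigr\|^{2},\\
y^{k+1} &\in \arg\min_{y}\; g(y) + \tfrac{\beta}{2}\bigl\|Ax^{k+1} + By - b + \lambda^{k}/\beta\bigr\|^{2},\\
\lambda^{k+1} &= \lambda^{k} + \tau\beta\bigl(Ax^{k+1} + By^{k+1} - b\bigr).
\end{aligned}
\]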

4.
In this paper, we propose an inexact multi-block ADMM-type first-order method for solving a class of high-dimensional convex composite conic optimization problems to moderate accuracy. The design of this method combines an inexact 2-block majorized semi-proximal ADMM and the recent advances in the inexact symmetric Gauss–Seidel (sGS) technique for solving a multi-block convex composite quadratic programming problem whose objective contains a nonsmooth term involving only the first block-variable. One distinctive feature of our proposed method (the sGS-imsPADMM) is that it only needs one cycle of an inexact sGS method, instead of an unknown number of cycles, to solve each of the subproblems involved. With some simple and implementable error tolerance criteria, the cost for solving the subproblems can be greatly reduced, and many steps in the forward sweep of each sGS cycle can often be skipped, which further contributes to the efficiency of the proposed method. Global convergence as well as the iteration complexity in the non-ergodic sense is established. Preliminary numerical experiments on some high-dimensional linear and convex quadratic SDP problems with a large number of linear equality and inequality constraints are also provided. The results show that for the vast majority of the tested problems, the sGS-imsPADMM is 2–3 times faster than the directly extended multi-block ADMM with the aggressive step-length of 1.618, which is currently the benchmark among first-order methods for solving multi-block linear and quadratic SDP problems, though its convergence is not guaranteed.

5.
During the last decade, the state-of-the-art alternating direction method of multipliers (ADMM) has successfully been used to solve many two-block separable convex minimization problems arising from applied areas such as signal/image processing and statistical and machine learning. It remains an interesting problem, however, how to implement ADMM for three-block separable convex minimization problems: many objective functions in the above-mentioned areas are more conveniently decomposed into the sum of three convex functions, yet the straightforward extension of ADMM from the two-block case to the three-block case is not necessarily convergent. In this paper, we introduce a new algorithm, called the partially isochronous splitting algorithm (PISA), to implement ADMM for the three-block separable model. The main idea of our algorithm is to incorporate only one proximal term into the last subproblem of the extended ADMM so that the resulting algorithm maximally inherits the promising properties of ADMM. A notable advantage over the extended ADMM is that two of the subproblems can be solved simultaneously, thereby taking advantage of the separable structure and of parallel architectures. Theoretically, we establish the global convergence of our algorithm under standard conditions, as well as its O(1/t) convergence rate in both the ergodic and nonergodic senses, where t is the iteration counter. The computational competitiveness of our algorithm is shown by numerical experiments on an application to the well-tested robust principal component analysis model.
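For context, the directly extended three-block ADMM referred to above performs, for \(\min\{f_1(x_1)+f_2(x_2)+f_3(x_3): A_1x_1+A_2x_2+A_3x_3=b\}\), the Gauss–Seidel sweep below, where \(\mathcal{L}_\beta\) is the augmented Lagrangian (this is the standard direct extension, not the PISA scheme itself):
\[
\begin{aligned}
x_1^{k+1} &\in \arg\min_{x_1}\; \mathcal{L}_\beta(x_1, x_2^{k}, x_3^{k}, \lambda^{k}),\\
x_2^{k+1} &\in \arg\min_{x_2}\; \mathcal{L}_\beta(x_1^{k+1}, x_2, x_3^{k}, \lambda^{k}),\\
x_3^{k+1} &\in \arg\min_{x_3}\; \mathcal{L}_\beta(x_1^{k+1}, x_2^{k+1}, x_3, \lambda^{k}),\\
\lambda^{k+1} &= \lambda^{k} + \beta\bigl(A_1x_1^{k+1}+A_2x_2^{k+1}+A_3x_3^{k+1}-b\bigr).
\end{aligned}
\]
According to the abstract, PISA instead adds a proximal term to the last subproblem so that two of the subproblems can be solved in parallel.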

6.
We propose an alternating direction method of multipliers (ADMM) for solving state-constrained optimization problems governed by elliptic equations. The unconstrained as well as box-constrained cases of the Dirichlet boundary control, Robin boundary control, and right-hand side control problems are considered here. These continuous optimization problems are transformed into discrete optimization problems by finite element discretization and then solved by ADMM. The ADMM is an efficient first-order algorithm with global convergence, which combines the decomposability of dual ascent with the superior convergence properties of the method of multipliers. We present a thorough convergence analysis of ADMM for these different types of optimization problems. Numerical experiments are performed to verify the efficiency of the method.

7.
The alternating direction method of multipliers (ADMM) is a benchmark for solving convex programming problems with separable objective functions and linear constraints. In the literature it has been illustrated as an application of the proximal point algorithm (PPA) to the dual problem of the model under consideration. This paper shows that ADMM can also be regarded as an application of PPA to the primal model with a customized choice of the proximal parameter. This primal illustration of ADMM is thus complementary to its dual illustration in the literature. This PPA revisit of ADMM from the primal perspective also enables us to easily recover the generalized ADMM proposed by Eckstein and Bertsekas. A worst-case O(1/t) convergence rate in the ergodic sense is established for a slight extension of Eckstein and Bertsekas's generalized ADMM.

8.
In this paper, we propose a generalized alternating direction method of multipliers (ADMM) with semi-proximal terms for solving a class of convex composite conic optimization problems, some of which are high-dimensional, to moderate accuracy. Our primary motivation is that this method, together with properly chosen semi-proximal terms, such as those generated by the recent advances in the block symmetric Gauss–Seidel technique, is capable of tackling these problems. Moreover, the proposed method, which relaxes both the primal and the dual variables in a natural way with a common relaxation factor in the interval (0, 2), has the potential of enhancing the performance of the classic ADMM. Extensive numerical experiments on various doubly non-negative semidefinite programming problems, with or without inequality constraints, are conducted. The corresponding results show that all these multi-block problems can be successfully solved, and the advantage of using the relaxation step is apparent.

9.
The alternating direction method of multipliers (ADMM) has recently received a lot of attention, especially due to its capability to harness the power of new parallel and distributed computing environments. However, ADMM can be notoriously slow, especially if the penalty parameter assigned to the augmented term in the objective function is not properly chosen. This paper aims to accelerate ADMM by integrating it with the Barzilai–Borwein gradient method and an acceleration technique known as line search. Line search accelerates an iterative method by performing a one-dimensional search along the line segment connecting two successive iterates. We pay special attention to large-scale nonnegative least squares problems, and our experiments using real datasets indicate that the integration not only accelerates ADMM but also robustifies it against the choice of the penalty parameter.
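For orientation, a fixed-penalty ADMM for the nonnegative least-squares problem \(\min_{x\ge 0}\tfrac12\|Ax-b\|^2\) can be sketched as follows. The Barzilai–Borwein and line-search acceleration proposed in the paper is not reproduced here; the penalty value and iteration count are placeholders, and for truly large-scale problems the dense inverse would be replaced by a cached factorization or an iterative solver.

```python
import numpy as np

def admm_nnls(A, b, rho=1.0, n_iter=300):
    """ADMM sketch for min 0.5*||A x - b||^2 subject to x >= 0,
    using the split x = z with z restricted to the nonnegative orthant."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)      # u is the scaled dual variable
    M_inv = np.linalg.inv(A.T @ A + rho * np.eye(n))     # factor once and reuse
    Atb = A.T @ b
    for _ in range(n_iter):
        x = M_inv @ (Atb + rho * (z - u))                # unconstrained least-squares step
        z = np.maximum(x + u, 0.0)                       # projection onto the nonnegative orthant
        u = u + x - z                                    # dual update
    return z
```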

10.
徐薇  吴钰炜  陈彩华 《计算数学》2018,40(4):436-449
The commodity distribution problem faced by enterprises is a typical linear multicommodity flow problem. With the expansion of business scale and the adoption of globalized operations, the problems enterprises face are becoming unprecedentedly large and their data storage increasingly distributed, so traditional methods can no longer meet the computational demand. Exploiting the decomposability of the alternating direction method of multipliers (ADMM), this paper proposes a class of randomized ADMM algorithms that decompose a large-scale problem into many smaller subproblems and solve these subproblems, together with the dual problem, in random order, eventually obtaining an optimal solution of the original problem. The algorithm overcomes the possible divergence of the directly extended ADMM when applied to multi-block problems. It is tested on several linear multicommodity flow problems of different sizes randomly generated by the MnetGen generator, which verifies its effectiveness and high solution efficiency.

11.
The alternating direction method of multipliers (ADMM) has been proved to be effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. GS-ADMM partitions the variables into two groups, one consisting of p block variables and the other of q block variables, where \(p \ge 1\) and \(q \ge 1\) are two integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method very attractive for big data settings. By adding proper proximal terms to the subproblems, we specify the domain of the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case \({\mathcal {O}}(1/t)\) ergodic convergence rate. It turns out that our convergence domain of the stepsizes is significantly larger than other convergence domains in the literature. Hence, GS-ADMM is more flexible, allowing larger stepsizes for the dual variable. Besides, two special cases of GS-ADMM, which allow zero penalty terms, are also discussed and analyzed. Compared with several state-of-the-art methods, preliminary numerical experiments on solving a sparse matrix minimization problem in statistical learning show that our proposed method is effective and promising.

12.
Structure-enforced matrix factorization (SeMF) represents a large class of mathematical models appearing in various forms of principal component analysis, sparse coding, dictionary learning and other machine learning techniques useful in many applications including neuroscience and signal processing. In this paper, we present a unified algorithm framework, based on the classic alternating direction method of multipliers (ADMM), for solving a wide range of SeMF problems whose constraint sets permit low-complexity projections. We propose a strategy to adaptively adjust the penalty parameters, which is key to achieving good performance with ADMM. We conduct extensive numerical experiments to compare the proposed algorithm with a number of state-of-the-art special-purpose algorithms on test problems including dictionary learning for sparse representation and sparse nonnegative matrix factorization. Results show that our unified SeMF algorithm can solve different types of factorization problems as reliably and as efficiently as special-purpose algorithms. In particular, our SeMF algorithm provides the ability to explicitly enforce various combinatorial sparsity patterns, which, to our knowledge, has not been considered in existing approaches.
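The adaptive-penalty idea can be illustrated by the widely used residual-balancing heuristic below; this is a generic scheme, not necessarily the strategy proposed in the paper, and the thresholds are illustrative. When the scaled dual variable is used, it must be rescaled by the same factor whenever the penalty changes.

```python
def update_penalty(rho, primal_res, dual_res, mu=10.0, tau=2.0):
    """Residual-balancing heuristic for the ADMM penalty parameter:
    grow rho when the primal residual dominates, shrink it when the
    dual residual dominates (illustrative thresholds mu and tau)."""
    if primal_res > mu * dual_res:
        return rho * tau
    if dual_res > mu * primal_res:
        return rho / tau
    return rho
```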

13.
Based on the alternating direction method of multipliers (ADMM), we develop three numerical algorithms incrementally for solving optimal control problems constrained by random Helmholtz equations. First, we apply the standard Monte Carlo technique and the finite element method for the random and spatial discretizations, respectively, and then use ADMM to solve the resulting system. Next, combining the multi-modes expansion, the Monte Carlo technique, the finite element method, and ADMM, we propose the second algorithm. In the third algorithm, we preprocess certain quantities before the ADMM iteration so that almost no random variables appear in the inner iterations. This algorithm is the most efficient of the three and is easy to implement. The error estimates of these three algorithms are established. Numerical experiments verify the efficiency of our algorithms.

14.
Recovering a low-rank and a sparse matrix from a given matrix arises in many applications, such as image processing and video background subtraction. The 3-block alternating direction method of multipliers (ADMM) has been applied successfully to solve convex problems with 3-block variables. However, the existing sufficient conditions that guarantee the convergence of the 3-block ADMM usually require the penalty parameter $\gamma$ to satisfy a certain bound, which may affect its performance on large-scale problems in practice. In this paper, we propose a 3-block ADMM to recover low-rank and sparse matrices from noisy observations. In theory, we prove that the 3-block ADMM is convergent when the penalty parameters satisfy a certain condition and that the sequence of objective function values generated by the 3-block ADMM converges to the optimal value. Numerical experiments verify that the proposed method can achieve higher performance than existing methods in terms of both efficiency and accuracy.
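As an illustration of this model class, a directly extended 3-block ADMM for a noisy low-rank-plus-sparse decomposition of the form \(\min \|L\|_*+\lambda\|S\|_1+\tfrac{1}{2\mu}\|Z\|_F^2\) subject to \(L+S+Z=M\) can be sketched as follows. This is a generic sketch under that particular model; the notation and the penalty conditions required by the paper's convergence theory are not encoded here.

```python
import numpy as np

def svt(X, tau):
    """Singular-value thresholding: prox of tau*||.||_* ."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Elementwise soft-thresholding: prox of tau*||.||_1 ."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def admm_low_rank_sparse(M, lam, mu=1.0, rho=1.0, n_iter=200):
    """Directly extended 3-block ADMM sketch for
    min ||L||_* + lam*||S||_1 + (1/(2*mu))*||Z||_F^2  s.t.  L + S + Z = M,
    where Z absorbs the observation noise."""
    L = np.zeros_like(M); S = np.zeros_like(M)
    Z = np.zeros_like(M); U = np.zeros_like(M)              # U is the scaled multiplier
    for _ in range(n_iter):
        L = svt(M - S - Z - U, 1.0 / rho)                   # low-rank block
        S = soft(M - L - Z - U, lam / rho)                  # sparse block
        Z = (rho * mu / (1.0 + rho * mu)) * (M - L - S - U)  # noise block (closed form)
        U = U + L + S + Z - M                               # multiplier update
    return L, S
```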

15.
The alternating direction method of multipliers (ADMM) is a benchmark for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. Meanwhile, it is known that convergence is not guaranteed if the ADMM is directly extended to a multiple-block convex minimization model whose objective function is the sum of more than two functions. Recently, some authors have actively studied strong convexity conditions on the objective function that suffice to ensure the convergence of the direct extension of ADMM, or the convergence that results when the original scheme is appropriately modified. We focus on the three-block case of such a model, whose objective function is the sum of three functions, and discuss the convergence of the direct extension of ADMM. We show that when one function in the objective is strongly convex and the penalty parameter and the operators in the linear equality constraint are appropriately restricted, the convergence of the direct extension of ADMM is guaranteed. We further estimate the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses, and derive global linear convergence in an asymptotic sense under some additional conditions.

16.
Sparse linear programming is widely used in financial computation, industrial production, assembly scheduling, and other fields. This paper first presents a general model of the sparse linear programming problem and proves that it is NP-hard; it then applies the alternating direction method of multipliers (ADMM) to solve the problem; finally, the convergence of the algorithm on an approximate problem is established. Numerical experiments show that the algorithm outperforms an existing hybrid genetic algorithm on large-scale test instances, and computations on a financial example verify the effectiveness of the algorithm and the model for the sparse portfolio selection problem.

17.
We analyze the convergence rate of the alternating direction method of multipliers (ADMM) for minimizing the sum of two or more nonsmooth convex separable functions subject to linear constraints. Previous analysis of the ADMM typically assumes that the objective function is the sum of only two convex functions defined on two separable blocks of variables even though the algorithm works well in numerical experiments for three or more blocks. Moreover, there has been no rate of convergence analysis for the ADMM without strong convexity in the objective function. In this paper we establish the global R-linear convergence of the ADMM for minimizing the sum of any number of convex separable functions, assuming that a certain error bound condition holds true and the dual stepsize is sufficiently small. Such an error bound condition is satisfied for example when the feasible set is a compact polyhedron and the objective function consists of a smooth strictly convex function composed with a linear mapping, and a nonsmooth \(\ell _1\) regularizer. This result implies the linear convergence of the ADMM for contemporary applications such as LASSO without assuming strong convexity of the objective function.

18.
Clustering is a fundamental problem in many scientific applications. Standard methods such as k-means, Gaussian mixture models, and hierarchical clustering, however, are beset by local minima, which are sometimes drastically suboptimal. Recently introduced convex relaxations of k-means and hierarchical clustering shrink cluster centroids toward one another and ensure a unique global minimizer. In this work, we present two splitting methods for solving the convex clustering problem. The first is an instance of the alternating direction method of multipliers (ADMM); the second is an instance of the alternating minimization algorithm (AMA). In contrast to previously considered algorithms, our ADMM and AMA formulations provide simple and unified frameworks for solving the convex clustering problem under the previously studied norms and open the door to potentially novel norms. We demonstrate the performance of our algorithm on both simulated and real data examples. While the differences between the two algorithms appear to be minor on the surface, complexity analysis and numerical experiments show AMA to be significantly more efficient. This article has supplementary materials available online.
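As a point of reference, the convex clustering model targeted by such splitting methods is commonly written as follows, with data points \(x_i\), centroids \(u_i\), edge set \(\mathcal{E}\), and weights \(w_{ij}\) (the article's exact notation and norm choices may differ):
\[
\min_{u_1,\dots,u_n}\;\; \frac{1}{2}\sum_{i=1}^{n}\|x_i - u_i\|_2^2 \;+\; \gamma \sum_{(i,j)\in\mathcal{E}} w_{ij}\,\|u_i - u_j\|,
\]
and the splitting formulations introduce auxiliary variables \(v_{ij}=u_i-u_j\) so that the nonsmooth fusion penalty decouples from the quadratic fidelity term.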

19.
Inverse variational inequalities have broad applications in various disciplines, and some of them have very appealing structures. There are several algorithms (e.g., proximal point algorithms and projection-type algorithms) for solving inverse variational inequalities in general settings, but few of them fully exploit these special structures. In this paper, we consider a class of inverse variational inequalities that has a separable structure and linear constraints, which has its roots in spatial economic equilibrium problems. To design an efficient algorithm, we develop an alternating direction method of multipliers (ADMM) based method that utilizes the separable structure. Under some mild assumptions, we prove its global convergence. We also propose an improved variant that makes the subproblems much easier to solve and derive the convergence result under the same conditions. Finally, we present preliminary numerical results to show the capability and efficiency of the proposed methods.

20.
This paper introduces an alternating direction method of multipliers (ADMM) for finding solutions to the Sylvester matrix equation AXB = E subject to a linear matrix inequality constraint relating CXD and G. Preliminary convergence properties of ADMM are presented. Numerical experiments are performed to illustrate the feasibility and effectiveness of ADMM. In addition, some numerical comparisons with a recent algorithm are also given.
