Similar Articles
Found 20 similar articles (search time: 31 ms)
1.
We introduce a gradient descent algorithm for solving large-scale unconstrained nonlinear optimization problems. The initial trial steplength is computed using both the quasi-Newton property and an approximation of the inverse Hessian by an appropriate scalar matrix; a nonmonotone line search technique is then applied to compute the steplength. The computational and storage complexity of the new method equals that of the Barzilai and Borwein method. On the other hand, the reported numerical results indicate improvements in favor of the new method over the well-known global Barzilai and Borwein method.
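As a rough illustration of where such steplengths come from (a sketch, not the authors' exact algorithm): if the inverse Hessian is approximated by a scalar matrix αI and the quasi-Newton secant condition is imposed in a least-squares sense, the two classical Barzilai-Borwein steplengths drop out.

```python
import numpy as np

def bb_steplengths(s, y):
    """Two Barzilai-Borwein steplengths, obtained by forcing a scalar
    matrix (1/alpha)*I to satisfy the secant condition B s = y in a
    least-squares sense, where s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    sy = s @ y
    alpha_bb1 = (s @ s) / sy        # minimizes ||s/alpha - y||
    alpha_bb2 = sy / (y @ y)        # minimizes ||s - alpha*y||
    return alpha_bb1, alpha_bb2
```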

2.
In nonlinear problems, the Hasofer–Lind–Rackwitz–Fiessler algorithm of the first-order reliability method sometimes fails to converge. A new Hasofer–Lind–Rackwitz–Fiessler algorithm incorporating the Barzilai–Borwein step is investigated in this paper to speed up the rate of convergence and behave stably. The algorithm is essentially built on the global Barzilai–Borwein gradient method and proceeds in two stages. The first stage, implemented by the traditional steepest descent method with specific decayed step sizes, prepares a good initial point for the global Barzilai–Borwein gradient algorithm in the second stage, which takes the merit function as the objective to locate the most probable failure point. The efficiency and convergence of the proposed method and some other reliability analysis methods are presented and discussed in detail through several numerical examples. The proposed method is found to be stable and very efficient on nonlinear problems, except for highly nonlinear ones, and even more accurate than the descent direction method with step sizes following the fixed exponential decay strategy.

3.
On the Barzilai and Borwein choice of steplength for the gradient method
In a recent paper, Barzilai and Borwein presented a new choice of steplength for the gradient method. Their choice does not guarantee descent in the objective function and greatly speeds up the convergence of the method. They presented a convergence analysis of their method only in the two-dimensional quadratic case. We establish the convergence of the Barzilai and Borwein gradient method when applied to the minimization of a strictly convex quadratic function of any number of variables.
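For reference, a minimal sketch of the BB gradient method applied to a strictly convex quadratic f(x) = 0.5·xᵀAx − bᵀx, the setting analysed above (the first steplength and the stopping rule are arbitrary choices, not from the paper):

```python
import numpy as np

def bb_quadratic(A, b, x0, tol=1e-8, max_iter=1000):
    """Barzilai-Borwein gradient method for min 0.5*x'Ax - b'x
    with A symmetric positive definite (strictly convex quadratic)."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)   # arbitrary first steplength
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)     # BB1 steplength for the next step
        x, g = x_new, g_new
    return x

# Example: a 100-dimensional SPD quadratic
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 100))
A = M @ M.T + 100 * np.eye(100)
x = bb_quadratic(A, rng.standard_normal(100), np.zeros(100))
```

For SPD quadratics, sᵀy = sᵀAs > 0, so the BB1 steplength is always well defined.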

4.
The Barzilai–Borwein (BB) gradient method has received many studies due to its simplicity and numerical efficiency. By incorporating a nonmonotone line search, Raydan (SIAM J Optim. 1997;7:26–33) successfully extended the BB gradient method to general unconstrained optimization problems, making it competitive with conjugate gradient methods. However, the numerical results reported by Raydan are poor for very ill-conditioned problems, because the effect of the degree of nonmonotonicity may be noticeable. In this paper, we focus on the nonmonotone line search technique used in the global Barzilai–Borwein (GBB) gradient method. We improve the performance of the GBB gradient method by proposing an adaptive nonmonotone line search based on the morphology of the objective function. We also prove the global convergence and the R-linear convergence rate of the proposed method under reasonable assumptions. Finally, we report numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection. The results show the efficiency of the proposed method in the sense of the performance profile introduced by Dolan and Moré (Math Program. 2002;91:201–213).
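The paper's adaptive rule is not reproduced here, but a fixed-memory nonmonotone (Grippo–Lampariello–Lucidi style) Armijo check of the kind used in GBB can be sketched as follows; the parameter names are illustrative:

```python
def nonmonotone_backtracking(f, x, g, d, f_hist, gamma=1e-4, sigma=0.5, alpha0=1.0):
    """Nonmonotone Armijo backtracking: the trial point only needs to
    improve on the maximum of the last M objective values (f_hist),
    not on the current value f(x).  d is a descent direction."""
    f_ref = max(f_hist)          # nonmonotone reference value
    gd = g @ d                   # directional derivative, assumed negative
    alpha = alpha0
    while f(x + alpha * d) > f_ref + gamma * alpha * gd:
        alpha *= sigma           # backtrack
    return alpha
```

The caller maintains f_hist as a bounded queue of the last M accepted objective values; M = 1 recovers the monotone Armijo rule, and larger M allows more nonmonotonicity.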

5.
Journal of Optimization Theory and Applications - The subspace minimization conjugate gradient methods based on Barzilai–Borwein method (SMCG_BB) and regularization model (SMCG_PR), which...

6.
In this paper, a class of minimization problems over density matrices arising in quantum state estimation is investigated. By making use of Nesterov's acceleration strategies, we introduce a modified augmented Lagrangian method to solve it, where the subproblem is tackled by the projected Barzilai–Borwein method with a nonmonotone line search. Several numerical examples show that, compared with the existing projected gradient method, the proposed method is efficient and promising.
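The abstract does not spell out the projection, but the standard Euclidean projection onto the density matrices {ρ : ρ ⪰ 0, tr ρ = 1} (an assumption here, not necessarily the authors' routine) reduces to projecting the eigenvalues onto the probability simplex:

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of a real vector v onto the probability simplex."""
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    j = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / j > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def project_to_density_matrices(X):
    """Project a Hermitian matrix onto {rho : rho >= 0, trace(rho) = 1}
    by projecting its eigenvalues onto the probability simplex."""
    w, V = np.linalg.eigh(X)
    w_proj = project_to_simplex(w)
    return (V * w_proj) @ V.conj().T          # V diag(w_proj) V^H
```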

7.
The aim of the nuclear norm minimization problem is to find a matrix that minimizes the sum of its singular values while satisfying some constraints. Such a problem has received much attention largely because it is closely related to the affine rank minimization problem, which appears in many control applications including controller design, realization theory, and model reduction. In this paper, we first propose an exact alternating direction method for solving the nuclear norm minimization problem with linear equality constraints. At each iteration, the method involves a singular value thresholding step and linear matrix equations that are solved exactly. Convergence of the proposed algorithm follows directly. To broaden the capacity for solving larger problems, we solve the subproblem approximately by an iterative method with the Barzilai–Borwein steplength. Some extensions to noisy problems and nuclear norm regularized least-squares problems are also discussed. Numerical experiments and comparisons with the state-of-the-art method FPCA show that the proposed method is effective and promising.
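The singular value thresholding step mentioned above is the proximal operator of the nuclear norm; a minimal sketch:

```python
import numpy as np

def singular_value_thresholding(X, tau):
    """Proximal operator of tau*||.||_* : shrink the singular values
    of X toward zero by tau and drop those that hit zero."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt
```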

8.
In this paper, a new type of stepsize for the gradient method, the approximate optimal stepsize, is introduced to interpret the Barzilai–Borwein (BB) method, and an efficient gradient method with an approximate optimal stepsize for strictly convex quadratic minimization is presented. Based on a multi-step quasi-Newton condition, we construct a new quadratic approximation model to generate an approximate optimal stepsize. We then use the two well-known BB stepsizes to truncate it, improving its numerical behaviour, and treat the resulting approximate optimal stepsize as the new stepsize for the gradient method. We establish the global convergence and R-linear convergence of the proposed method. Numerical results show that the proposed method outperforms some well-known gradient methods.
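A hedged sketch of the truncation idea only (the paper's multi-step quadratic model for the trial stepsize is not reproduced): clamp the trial value into the interval spanned by the two BB stepsizes.

```python
def truncated_stepsize(alpha_trial, s, y):
    """Clamp a trial 'approximate optimal' stepsize into the interval
    spanned by the two BB stepsizes.  s, y are numpy arrays with
    s @ y > 0 (guaranteed for strictly convex quadratics)."""
    sy = s @ y
    bb1 = (s @ s) / sy
    bb2 = sy / (y @ y)
    lo, hi = min(bb1, bb2), max(bb1, bb2)
    return min(max(alpha_trial, lo), hi)
```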

9.
In this paper, we present an efficient method for nonnegative matrix factorization based on the alternating nonnegative least squares framework. Our approach adopts a monotone projected Barzilai–Borwein (MPBB) method as an essential subroutine where the step length is determined without line search. The Lipschitz constant of the gradient is exploited to accelerate convergence. Global convergence of the proposed MPBB method is established. Numerical results are reported to demonstrate the efficiency of our algorithm.
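A minimal sketch of one projected BB-flavoured update for the alternating NNLS subproblem min_{H≥0} 0.5·‖WH − V‖²_F, with the Lipschitz constant of the gradient used as a safeguard; the paper's exact monotonicity test is not reproduced:

```python
import numpy as np

def projected_bb_step(W, V, H, H_prev, G_prev):
    """One projected Barzilai-Borwein step for min_{H>=0} 0.5*||W H - V||_F^2.
    The gradient is W'(W H - V); its Lipschitz constant is the largest
    eigenvalue of W'W, used as a fallback when the BB ratio is unusable.
    Returns (H_new, G) so the caller can thread (H, G) into the next call."""
    G = W.T @ (W @ H - V)
    S, Y = H - H_prev, G - G_prev
    sy = np.sum(S * Y)
    if sy > 0:
        alpha = np.sum(S * S) / sy                   # BB1 steplength
    else:
        alpha = 1.0 / np.linalg.norm(W.T @ W, 2)     # 1/L fallback
    H_new = np.maximum(H - alpha * G, 0.0)           # project onto H >= 0
    return H_new, G
```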

10.
For solving large-scale unconstrained minimization problems, the nonlinear conjugate gradient method is attractive due to its simplicity, low storage, efficiency, and nice convergence properties. Among the methods in this framework, the conjugate gradient descent algorithm CG_DESCENT is very popular: the generated directions descend automatically, and this nice property is independent of the line search used. In this paper, we generalize CG_DESCENT with two Barzilai–Borwein steplengths reused cyclically. We show that the resulting algorithm enjoys an attractive sufficient descent property and converges globally under some mild conditions. We test the proposed algorithm on a large set of high-dimensional unconstrained problems from the CUTEr library. The numerical comparisons with the state-of-the-art algorithm CG_DESCENT illustrate that the proposed method is effective, competitive, and promising.

11.
We propose a new gradient projection algorithm that compares favorably with the fastest algorithms available to date for ℓ1-constrained sparse recovery from noisy data, both in the compressed sensing and inverse problem frameworks. The method exploits a line search along the feasible direction and an adaptive steplength selection based on recent strategies for alternating the well-known Barzilai–Borwein rules. The convergence of the proposed approach is discussed, and a computational study on both well-conditioned and ill-conditioned problems is carried out to evaluate performance in comparison with five other algorithms proposed in the literature.
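One published alternation strategy for the two BB rules takes the shorter steplength when the two disagree strongly; whether this matches the authors' exact adaptive rule is an assumption:

```python
def adaptive_bb(s, y, tau=0.5):
    """Adaptive alternation of the two BB rules: when the steplengths
    differ a lot (bb2/bb1 below tau), take the short one (bb2);
    otherwise take bb1.  s, y are numpy arrays with s @ y > 0."""
    sy = s @ y
    bb1 = (s @ s) / sy
    bb2 = sy / (y @ y)
    return bb2 if bb2 / bb1 < tau else bb1
```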

12.
In this paper we view the Barzilai and Borwein (BB) method from a new angle and present a new adaptive Barzilai and Borwein (NABB) method with a nonmonotone line search for general unconstrained optimization. In the proposed method, the scalar approximation to the Hessian matrix is updated by the Broyden class formula to generate an adaptive stepsize. Notably, the new stepsize is chosen adaptively within the interval containing the two well-known BB stepsizes. Moreover, for negative curvature directions, a strategy for choosing the stepsize is designed to accelerate the convergence rate of the NABB method. Furthermore, we apply the NABB method without any line search to strictly convex quadratic minimization. The numerical experiments show the NABB method is very promising.

13.
We consider the expected residual minimization (ERM) formulation of the stochastic linear complementarity problem (SLCP). By employing the Barzilai–Borwein (BB) stepsize and an active set strategy, we present a BB-type method for solving the ERM problem. The global convergence of the proposed method is proved under mild conditions. Preliminary numerical results show that the method is promising.

14.
Optimization, 2012, 61(4-5): 395-415
The Barzilai and Borwein (BB) gradient method does not guarantee a descent in the objective function at each iteration, but performs better than the classical steepest descent (SD) method in practice. So far, the BB method has found many successful applications and generalizations in linear systems, unconstrained optimization, convex-constrained optimization, stochastic optimization, etc. In this article, we propose a new gradient method that uses the SD and the BB steps alternately. Hence the name “alternate step (AS) gradient method.” Our theoretical and numerical analyses show that the AS method is a promising alternative to the BB method for linear systems. Unconstrained optimization algorithms related to the AS method are also discussed. Particularly, a more efficient gradient algorithm is provided by exploring the idea of the AS method in the GBB algorithm by Raydan (1997).

To establish a general R-linear convergence result for gradient methods, an important property of the stepsize is identified in this article. Consequently, an R-linear convergence result is established for a large collection of gradient methods, including the AS method. Some interesting insights into gradient methods and a discussion of monotonicity and nonmonotonicity are also given.
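A minimal sketch of the alternation on a strictly convex quadratic: because the BB1 steplength at iteration k equals the exact steepest-descent (Cauchy) steplength at iteration k−1, alternating SD and BB amounts to computing the Cauchy step once and using it twice.

```python
import numpy as np

def alternate_step_quadratic(A, b, x0, tol=1e-8, max_iter=1000):
    """Alternate step (AS) sketch for min 0.5*x'Ax - b'x: even iterations
    take the exact steepest-descent (Cauchy) steplength; odd iterations
    reuse it, which for quadratics coincides with the BB1 steplength
    computed from the previous step."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 0.0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if k % 2 == 0:
            alpha = (g @ g) / (g @ (A @ g))   # exact SD steplength
        # on odd k, alpha is reused: it equals BB1 for the current (s, y)
        x = x - alpha * g
        g = A @ x - b
    return x
```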

15.
R-linear convergence of the Barzilai and Borwein gradient method
Combined with a non-monotone line search, the Barzilai and Borwein (BB) gradient method has been successfully extended for solving unconstrained optimization problems and is competitive with conjugate gradient methods. In this paper, we establish the R-linear convergence of the BB method for strongly convex quadratics of any dimension. One corollary of this result is that the BB method is also locally R-linearly convergent for general objective functions, and hence the stepsize in the BB method will always be accepted by the non-monotone line search when the iterate is close to the solution.

16.
In this paper, we present an extension of the Barzilai–Borwein method to the minimization of a quadratic form defined on a finite-dimensional real Euclidean space, together with a convergence analysis of the proposed method. We also present some applications to the solution of linear matrix equations, such as the Sylvester equation, which are of great interest in control theory and other engineering disciplines. Our numerical results indicate that the new method competes satisfactorily with the lyap function of MATLAB's Control Toolbox for solving linear matrix equations.
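A hedged sketch of how a BB iteration can be applied to the Sylvester equation AX + XB = C, by minimizing f(X) = 0.5·‖AX + XB − C‖²_F (the paper's exact formulation may differ):

```python
import numpy as np

def bb_sylvester(A, B, C, X0, tol=1e-10, max_iter=5000):
    """BB gradient iteration for the Sylvester equation AX + XB = C,
    minimizing f(X) = 0.5*||AX + XB - C||_F^2.  The gradient of f is
    A'(AX + XB - C) + (AX + XB - C)B'."""
    X = X0.astype(float)
    R = A @ X + X @ B - C
    G = A.T @ R + R @ B.T
    alpha = 1.0 / np.linalg.norm(G)
    for _ in range(max_iter):
        if np.linalg.norm(R) < tol:
            break
        X_new = X - alpha * G
        R = A @ X_new + X_new @ B - C
        G_new = A.T @ R + R @ B.T
        S, Y = X_new - X, G_new - G
        alpha = np.sum(S * S) / np.sum(S * Y)   # BB1 steplength
        X, G = X_new, G_new
    return X
```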

17.
In this paper we propose new globalization strategies for the Barzilai and Borwein gradient method, based on suitable relaxations of the monotonicity requirements. In particular, we define a class of algorithms that combine nonmonotone watchdog techniques with nonmonotone line search rules, and we prove the global convergence of these schemes. We then perform an extensive computational study, which shows the effectiveness of the proposed approach for large-dimensional unconstrained optimization problems.
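A generic sketch of the watchdog idea (the paper's specific rules are not reproduced): let raw BB steps run unsafeguarded for up to N iterations; if no progress over the reference value has been made by then, restart from the best point seen with a safeguarded backtracking step.

```python
import numpy as np

def watchdog_bb(f, grad, x0, N=6, max_iter=1000, tol=1e-8):
    """Watchdog globalization around raw BB steps: the iteration runs
    free for up to N steps; if the objective has not improved on the
    reference value by then, restart from the best point found and
    take an Armijo-backtracked gradient step there."""
    x = x0.astype(float)
    g = grad(x)
    best_x, best_f = x.copy(), f(x)
    f_ref, watch = best_f, 0
    alpha = 1.0 / np.linalg.norm(g)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g                        # raw BB step, no check
        f_new, g_new = f(x_new), grad(x_new)
        if f_new < best_f:
            best_x, best_f = x_new.copy(), f_new
        watch += 1
        if f_new < f_ref:                            # progress: reset watchdog
            f_ref, watch = f_new, 0
        elif watch >= N:                             # no progress for N steps
            x_new, g_new = best_x.copy(), grad(best_x)
            a = 1.0
            while f(x_new - a * g_new) > best_f - 1e-4 * a * (g_new @ g_new):
                a *= 0.5                             # Armijo backtracking
            x_new = x_new - a * g_new
            f_new, g_new = f(x_new), grad(x_new)
            f_ref, watch = f_new, 0
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1.0      # BB1 with safeguard
        x, g = x_new, g_new
    return x
```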

18.
The alternating direction method of multipliers (ADMM) has recently received much attention, especially due to its capability to harness the power of new parallel and distributed computing environments. However, ADMM can be notoriously slow, especially if the penalty parameter assigned to the augmented term in the objective function is not properly chosen. This paper aims to accelerate ADMM by integrating it with the Barzilai–Borwein gradient method and an acceleration technique known as line search, which speeds up an iterative method by performing a one-dimensional search along the line segment connecting two successive iterates. We pay special attention to large-scale nonnegative least squares problems, and our experiments with real datasets indicate that the integration not only accelerates ADMM but also robustifies it against the penalty parameter.
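For orientation, a plain ADMM baseline for nonnegative least squares, min 0.5·‖Ax − b‖² subject to x = z, z ≥ 0; the paper replaces the exact x-update with BB gradient steps and adds a line search, neither of which is reproduced here:

```python
import numpy as np

def admm_nnls(A, b, rho=1.0, max_iter=500, tol=1e-8):
    """Plain ADMM for nonnegative least squares:
        min 0.5*||Ax - b||^2   subject to   x = z,  z >= 0."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # factor the x-update once
    for _ in range(max_iter):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))   # x-update
        z_new = np.maximum(x + u, 0.0)                      # z-update: projection
        u += x - z_new                                      # dual update
        if np.linalg.norm(z_new - z) < tol:
            z = z_new
            break
        z = z_new
    return z
```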

19.
In this paper, using the residual vector and an approximation to the steepest descent direction of the norm function, we develop a norm descent spectral method for solving symmetric nonlinear equations. The method, based on nonmonotone line search techniques, is shown to be globally convergent. A specific implementation of the method is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) method for unconstrained optimization. Preliminary numerical results indicate that the method is promising.
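A hedged sketch of a spectral residual iteration with the cyclic BB idea, recomputing the spectral coefficient only once per cycle; the globalizing nonmonotone line search from the paper is omitted:

```python
import numpy as np

def cyclic_spectral_residual(F, x0, cycle=4, tol=1e-8, max_iter=2000):
    """Spectral residual iteration x+ = x - sigma*F(x) for nonlinear
    equations F(x) = 0, with the cyclic BB idea: the coefficient sigma
    is recomputed only every `cycle` iterations and reused in between."""
    x = x0.astype(float)
    Fx = F(x)
    sigma = 1.0
    for k in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        x_new = x - sigma * Fx
        F_new = F(x_new)
        if k % cycle == cycle - 1:          # refresh sigma once per cycle
            s, y = x_new - x, F_new - Fx
            sy = s @ y
            if abs(sy) > 1e-12:
                sigma = (s @ s) / sy        # BB1-type spectral coefficient
        x, Fx = x_new, F_new
    return x
```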

20.
The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai–Borwein choice of step-size and other related ideas. These methods have proved competitive with conjugate gradient methods for the minimization of large-dimensional unconstrained problems. This paper suggests a method which is able to take advantage of a few additional 'long' vectors of storage to achieve a significant improvement in performance, both for quadratic and non-quadratic objective functions. It makes use of certain Ritz values related to the Lanczos process (Lanczos in J Res Nat Bur Stand 45:255–282, 1950). Some underlying theory is provided, and numerical evidence shows that the new method provides a competitive and simpler alternative to the state-of-the-art L-BFGS limited memory method.
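A sketch of the underlying Ritz-value machinery, under the assumption that the Hessian A is available explicitly (in the actual method the Lanczos quantities are recovered from a few stored gradients): run a handful of Lanczos steps, take the eigenvalues of the resulting tridiagonal matrix, and use their reciprocals as the next sweep of steplengths.

```python
import numpy as np

def ritz_steplengths(A, g, m=5):
    """Run m steps of the Lanczos process on A, started from the current
    gradient g; the eigenvalues of the resulting tridiagonal matrix T are
    Ritz values of A, and their reciprocals can serve as the next sweep
    of gradient steplengths."""
    n = len(g)
    m = min(m, n)
    alpha = np.zeros(m)
    beta = np.zeros(m)
    q = g / np.linalg.norm(g)
    q_prev = np.zeros(n)
    b = 0.0
    for j in range(m):
        w = A @ q - b * q_prev            # three-term Lanczos recurrence
        alpha[j] = q @ w
        w -= alpha[j] * q
        b = np.linalg.norm(w)
        beta[j] = b
        if b < 1e-12:                     # invariant subspace found
            m = j + 1
            break
        q_prev, q = q, w / b
    T = np.diag(alpha[:m]) + np.diag(beta[:m-1], 1) + np.diag(beta[:m-1], -1)
    return 1.0 / np.linalg.eigvalsh(T)    # Ritz values -> steplengths
```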
