Similar Literature
A total of 20 similar documents were found (search time: 281 ms).
1.
R-linear convergence of the Barzilai and Borwein gradient method (Total citations: 4; self-citations: 0; cited by others: 4)
Combined with a non-monotone line search, the Barzilai and Borwein (BB) gradient method has been successfully extended for solving unconstrained optimization problems and is competitive with conjugate gradient methods. In this paper, we establish the R-linear convergence of the BB method for any-dimensional strongly convex quadratics. One corollary of this result is that the BB method is also locally R-linearly convergent for general objective functions, and hence the stepsize in the BB method will always be accepted by the non-monotone line search when the iterate is close to the solution.
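For reference across these abstracts, the two Barzilai–Borwein stepsizes for the iteration x_{k+1} = x_k − α_k g_k are standard and can be stated as follows (this is textbook material, not taken from the paper itself):

```latex
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}},
\qquad
s_{k-1} = x_k - x_{k-1},\quad y_{k-1} = g_k - g_{k-1}.
```

For strongly convex quadratics both quantities stay positive, which is the setting in which the R-linear rate above is established.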

2.
The trust region (TR) method is a class of effective methods for optimization. The conic model can be regarded as a generalized quadratic model, and it retains the good convergence properties of the quadratic model near the minimizer. The Barzilai and Borwein (BB) gradient method is also effective and can be used for solving large-scale optimization problems, since it avoids the expensive computation and storage of matrices. In addition, the BB stepsize is easy to determine without large computational effort. In this paper, based on the conic trust region framework, we employ the generalized BB stepsize and propose a new nonmonotone adaptive trust region method based on a simple conic model for large-scale unconstrained optimization. Unlike the traditional conic model, the Hessian approximation is a scalar matrix based on the generalized BB stepsize, which results in a simple conic model. By adding the nonmonotone and adaptive techniques to the simple conic model, the new method needs less storage and converges faster. The global convergence of the algorithm is established under certain conditions. Numerical results indicate that the new method is effective and attractive for large-scale unconstrained optimization problems.
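As a hedged illustration of the "simple conic model" described here (the abstract does not spell out the exact form, so the horizon vector a_k and the scalar below are assumptions based on the standard conic model), a conic model whose Hessian approximation is the scalar matrix B_k = (1/α_k^{BB}) I takes the form

```latex
m_k(d) \;=\; f_k \;+\; \frac{g_k^{\top} d}{1 - a_k^{\top} d}
\;+\; \frac{d^{\top} d}{2\,\alpha_k^{\mathrm{BB}}\,\bigl(1 - a_k^{\top} d\bigr)^{2}},
\qquad B_k = \tfrac{1}{\alpha_k^{\mathrm{BB}}}\, I,
```

so only the scalar α_k^{BB} and the horizon vector a_k need to be stored, rather than a full Hessian approximation.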

3.
In this paper, a new type of stepsize, the approximate optimal stepsize, for the gradient method is introduced to interpret the Barzilai–Borwein (BB) method, and an efficient gradient method with an approximate optimal stepsize for the strictly convex quadratic minimization problem is presented. Based on a multi-step quasi-Newton condition, we construct a new quadratic approximation model to generate an approximate optimal stepsize. We then use the two well-known BB stepsizes to truncate it for improving numerical effects and treat the resulting approximate optimal stepsize as the new stepsize for the gradient method. We establish the global convergence and R-linear convergence of the proposed method. Numerical results show that the proposed method outperforms some well-known gradient methods.
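The abstract does not give the truncation rule explicitly; a typical way to truncate an approximate optimal stepsize \tilde{\alpha}_k with the two BB stepsizes (shown here purely as an illustrative assumption) is

```latex
\alpha_k \;=\; \max\Bigl\{\alpha_k^{\mathrm{BB2}},\; \min\bigl\{\tilde{\alpha}_k,\; \alpha_k^{\mathrm{BB1}}\bigr\}\Bigr\},
```

which is well defined because α_k^{BB2} ≤ α_k^{BB1} whenever s_{k-1}^⊤ y_{k-1} > 0, by the Cauchy–Schwarz inequality.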

4.
The Barzilai–Borwein (BB) gradient method has received many studies due to its simplicity and numerical efficiency. By incorporating a nonmonotone line search, Raydan (SIAM J Optim. 1997;7:26–33) has successfully extended the BB gradient method for solving general unconstrained optimization problems so that it is competitive with conjugate gradient methods. However, the numerical results reported by Raydan are poor for very ill-conditioned problems because the effect of the degree of nonmonotonicity may be noticeable. In this paper, we focus on the nonmonotone line search technique used in the global Barzilai–Borwein (GBB) gradient method. We improve the performance of the GBB gradient method by proposing an adaptive nonmonotone line search based on the morphology of the objective function. We also prove the global convergence and the R-linear convergence rate of the proposed method under reasonable assumptions. Finally, we report numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection. The results show the efficiency of the proposed method in the sense of the performance profile introduced by Dolan and Moré (Math Program. 2002;91:201–213).
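The nonmonotone condition underlying the GBB method is the Grippo–Lampariello–Lucidi rule, which accepts a step when the objective falls below the maximum of the last M function values plus a sufficient-decrease term. A minimal Python sketch of the non-adaptive version follows (function and parameter names are illustrative; the paper's contribution is an adaptive choice of the memory, which is not reproduced here):

```python
import numpy as np

def nonmonotone_backtracking(f, x, fx_hist, g, d, alpha0, gamma=1e-4, tau=0.5):
    """Backtrack from alpha0 until the GLL nonmonotone Armijo condition holds:
    f(x + alpha*d) <= max(last M objective values) + gamma*alpha*g.dot(d).
    g is the gradient at x and d a descent direction (g.dot(d) < 0)."""
    f_ref = max(fx_hist)              # nonmonotone reference value
    slope = gamma * g.dot(d)          # sufficient-decrease slope (negative)
    alpha = alpha0
    while f(x + alpha * d) > f_ref + alpha * slope:
        alpha *= tau                  # shrink the step and retry
    return alpha

# Tiny demo on f(x) = 0.5*||x||^2, taking d = -g as in the GBB method:
f = lambda x: 0.5 * x.dot(x)
x = np.ones(3)
g = x.copy()                          # gradient of f at x
print(nonmonotone_backtracking(f, x, [f(x)], g, -g, alpha0=4.0))
```

In the GBB setting, `alpha0` is the BB stepsize and `d = -g`.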

5.
We consider the expected residual minimization (ERM) formulation of the stochastic linear complementarity problem (SLCP). By employing the Barzilai–Borwein (BB) stepsize and an active set strategy, we present a BB-type method for solving the ERM problem. The global convergence of the proposed method is proved under mild conditions. Preliminary numerical results show that the method is promising.
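For context, the ERM formulation of the SLCP x ≥ 0, M(ω)x + q(ω) ≥ 0, x^⊤(M(ω)x + q(ω)) = 0 is usually written with the componentwise-minimum residual (a standard formulation, not specific to this paper):

```latex
\min_{x \ge 0}\; \mathbb{E}\Bigl[\bigl\|\min\bigl(x,\; M(\omega)x + q(\omega)\bigr)\bigr\|^{2}\Bigr],
```

with the min taken componentwise; the nonnegativity constraint is where the active set strategy mentioned above comes in.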

6.
We propose a new monotone algorithm for unconstrained optimization in the framework of the Barzilai and Borwein (BB) method and analyze the convergence properties of this new descent method. Motivated by the fact that the BB method does not guarantee descent in the objective function at each iteration, yet performs better than the steepest descent method, we attempt to find a stepsize formula that approximates the Hessian based on the quasi-Cauchy equation while possessing the monotone property at each iteration. Practical insights into the effectiveness of the proposed techniques are given by a numerical comparison with the BB method.
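For the reader's orientation (a standard derivation; the paper's safeguards are omitted): forcing a scalar Hessian approximation B_k = (1/α_k) I to satisfy the quasi-Cauchy (weak secant) equation yields exactly the first BB stepsize,

```latex
s_{k-1}^{\top} B_k\, s_{k-1} = s_{k-1}^{\top} y_{k-1},
\qquad B_k = \tfrac{1}{\alpha_k}\, I
\;\;\Longrightarrow\;\;
\alpha_k = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
```

and a monotone variant then restricts or safeguards this choice so that descent holds at every iteration.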

7.
Optimization, 2012, 61(4–5): 395–415
The Barzilai and Borwein (BB) gradient method does not guarantee a descent in the objective function at each iteration, but performs better than the classical steepest descent (SD) method in practice. So far, the BB method has found many successful applications and generalizations in linear systems, unconstrained optimization, convex-constrained optimization, stochastic optimization, etc. In this article, we propose a new gradient method that uses the SD and the BB steps alternately. Hence the name “alternate step (AS) gradient method.” Our theoretical and numerical analyses show that the AS method is a promising alternative to the BB method for linear systems. Unconstrained optimization algorithms related to the AS method are also discussed. Particularly, a more efficient gradient algorithm is provided by exploring the idea of the AS method in the GBB algorithm by Raydan (1997).

To establish a general R-linear convergence result for gradient methods, an important property of the stepsize is identified in this article. Consequently, an R-linear convergence result is established for a large collection of gradient methods, including the AS method. Some interesting insights into gradient methods and a discussion of monotonicity and nonmonotonicity are also given.
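A minimal sketch of the alternate-step idea for the quadratic case, assuming the simplest possible alternation rule (the paper's actual scheme may differ in its details): for f(x) = ½x^⊤Ax − b^⊤x the exact line-search (Cauchy) step is g_k^⊤g_k / g_k^⊤Ag_k, and the BB1 step at iteration k equals the Cauchy step of iteration k−1, so the alternation can be implemented by caching one scalar:

```python
import numpy as np

def alternate_step(A, b, x0, tol=1e-8, max_iter=1000):
    """Alternate-step gradient sketch for solving A x = b (A SPD).

    Even iterations take the exact steepest-descent (Cauchy) step;
    odd iterations reuse the previous Cauchy step, which for quadratics
    coincides with the BB1 stepsize. Illustrative, not the paper's exact rule.
    """
    x = x0.copy()
    alpha_prev = None
    for k in range(max_iter):
        g = A @ x - b                          # gradient of the quadratic
        if np.linalg.norm(g) <= tol:
            break
        alpha_sd = g.dot(g) / g.dot(A @ g)     # exact line-search step
        if k % 2 == 0 or alpha_prev is None:
            alpha = alpha_sd                   # SD step
        else:
            alpha = alpha_prev                 # BB-like reuse of the last SD step
        alpha_prev = alpha_sd
        x -= alpha * g
    return x
```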

8.
We introduce a gradient descent algorithm for solving large-scale unconstrained nonlinear optimization problems. The computation of the initial trial steplength is based on both the quasi-Newton property and an approximation of the inverse Hessian by an appropriate scalar matrix. A nonmonotone line search technique is then applied to compute the steplength. The computational and storage complexity of the new method equals that of the Barzilai and Borwein method. On the other hand, the reported numerical results indicate improvements of the new method over the well-known global Barzilai and Borwein method.

9.
A NEW STEPSIZE FOR THE STEEPEST DESCENT METHOD (Total citations: 8; self-citations: 0; cited by others: 8)
The steepest descent method is the simplest gradient method for optimization. It is well known that exact line searches along each steepest descent direction may converge very slowly. An important result was given by Barzilai and Borwein, whose method is proved to be superlinearly convergent for convex quadratics in two-dimensional space and performs quite well for high-dimensional problems. The BB method is not monotone, so it is not easy to generalize to general nonlinear functions unless certain nonmonotone techniques are applied. Therefore, it is very desirable to find stepsize formulae which enable fast convergence and possess the monotone property. Such a stepsize α_k for the steepest descent method is suggested in this paper. An algorithm that uses this new stepsize in even iterations and exact line search in odd iterations is proposed. Numerical results are presented, which confirm that the new method can find the exact solution within 3 iterations for two-dimensional problems. The new method is very efficient for small-scale problems. A modified version of the new method is also presented, where the new technique for selecting the stepsize is used after every two exact line searches. The modified algorithm is comparable to the Barzilai–Borwein method for large-scale problems and better for small-scale problems.

10.
A smoothing method for solving stochastic linear complementarity problems is proposed. The expected residual minimization reformulation of the problem is considered and approximated by the sample average approximation (SAA). The proposed method sequentially solves a sequence of smoothing problems, each defined with its own sample average approximation. A nonmonotone line search with a variant of the Barzilai–Borwein (BB) gradient direction is used for solving each smoothing problem. The BB search direction is efficient and cheap to compute, and is particularly suitable for a nonmonotone line search procedure. The variable sample size scheme allows the sample size to vary across iterations, and the method tends to use smaller sample sizes far away from the solution. The key point of this strategy is a good balance between the variable sample size strategy, the smoothing sequence, and nonmonotonicity. Eventually, the maximal sample size is used and the SAA problem is solved. The numerical results presented indicate that the proposed strategy reduces the overall computational cost.
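An outer-loop sketch of the variable sample size idea, under the assumption of a simple geometric growth rule (the paper balances sample size, smoothing, and nonmonotonicity with a more refined strategy; `solve_smooth` and all parameters here are hypothetical names):

```python
import numpy as np

def variable_sample_loop(solve_smooth, x0, n_min=100, n_max=10000, growth=2):
    """Outer loop sketch of a variable sample size scheme: each smoothing
    problem is defined with its own SAA and solved inexactly, warm-starting
    the next one; the full sample is used only at the end.
    solve_smooth(n, x0, tol) is assumed to run the inner nonmonotone BB
    solver on the SAA problem with n samples."""
    x, n = x0, n_min
    while n < n_max:
        x = solve_smooth(n, x, tol=1.0 / np.sqrt(n))  # looser tol, smaller sample
        n = min(growth * n, n_max)                    # grow the sample size
    return solve_smooth(n_max, x, tol=1e-6)           # final full-sample SAA solve
```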

11.
In this paper, by using the residual vector and an approximation to the steepest descent direction of the norm function, we develop a norm descent spectral method for solving symmetric nonlinear equations. The method, based on nonmonotone line search techniques, is shown to be globally convergent. A specific implementation of the method is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) method for unconstrained optimization. Preliminary numerical results indicate that the method is promising.
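A bare-bones sketch of a spectral residual iteration for F(x) = 0, with the norm-descent nonmonotone line search and the CBB refinement omitted (so this shows only the spirit of the direction and stepsize, not the paper's full method):

```python
import numpy as np

def spectral_residual(F, x0, sigma0=1.0, tol=1e-8, max_iter=500):
    """Spectral residual iteration: use the residual as the search direction
    with a BB-type spectral coefficient. Illustrative sketch only."""
    x = x0.copy()
    Fx = F(x)
    sigma = sigma0
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        x_new = x - sigma * Fx                 # residual direction, spectral step
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        sty = s.dot(y)
        sigma = s.dot(s) / sty if abs(sty) > 1e-16 else sigma0  # BB-type update
        x, Fx = x_new, F_new
    return x
```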

12.
In this paper, we present an efficient method for nonnegative matrix factorization based on the alternating nonnegative least squares framework. Our approach adopts a monotone projected Barzilai–Borwein (MPBB) method as an essential subroutine where the step length is determined without line search. The Lipschitz constant of the gradient is exploited to accelerate convergence. Global convergence of the proposed MPBB method is established. Numerical results are reported to demonstrate the efficiency of our algorithm.
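A sketch of one monotone projected BB-type step on the H-subproblem of the ANLS framework, min_{H≥0} ½‖V − WH‖_F²; the 1/L safeguard, with L = ‖W^⊤W‖₂ the Lipschitz constant of the gradient, is what removes the need for a line search. This is an illustrative reading of the abstract, not the paper's exact update rule:

```python
import numpy as np

def projected_bb_step(V, W, H, H_prev=None, grad_prev=None):
    """One monotone projected BB-type step for min_{H >= 0} 0.5*||V - W H||_F^2.

    The BB stepsize is capped at 1/L, which guarantees decrease for this
    L-smooth convex subproblem (descent lemma), hence no line search."""
    grad = W.T @ (W @ H - V)                  # gradient of the subproblem
    L = np.linalg.norm(W.T @ W, 2)            # Lipschitz constant of the gradient
    if H_prev is None or grad_prev is None:
        alpha = 1.0 / L                       # safe first step
    else:
        s = (H - H_prev).ravel()
        y = (grad - grad_prev).ravel()
        sty = s.dot(y)
        alpha = s.dot(s) / sty if sty > 0 else 1.0 / L   # BB1 stepsize
        alpha = min(alpha, 1.0 / L)           # safeguard keeps the step monotone
    H_new = np.maximum(0.0, H - alpha * grad) # projection onto H >= 0
    return H_new, grad
```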

13.
In this paper, a class of minimization problems over density matrices arising in quantum state estimation is investigated. By making use of Nesterov's acceleration strategies, we introduce a modified augmented Lagrangian method to solve it, where the subproblem is tackled by the projected Barzilai–Borwein method with nonmonotone line search. Several numerical examples show that the proposed method is efficient and promising compared with the existing projected gradient method.

14.
The cyclic Barzilai–Borwein method for unconstrained optimization (Total citations: 1; self-citations: 0; cited by others: 1)
In the cyclic Barzilai–Borwein (CBB) method, the same Barzilai–Borwein (BB) stepsize is reused for m consecutive iterations. It is proved that CBB is locally linearly convergent at a local minimizer with positive definite Hessian. Numerical evidence indicates that when m > n/2 ≥ 3, where n is the problem dimension, CBB is locally superlinearly convergent. In the special case m = 3 and n = 2, it is proved that the convergence rate is no better than linear, in general. An implementation of the CBB method, called adaptive cyclic Barzilai–Borwein (ACBB), combines a non-monotone line search and an adaptive choice for the cycle length m. In numerical experiments using the CUTEr test problem library, ACBB performs better than the existing BB gradient algorithm, while it is competitive with the well-known PRP+ conjugate gradient algorithm.
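A minimal sketch of the cycling rule (line search and the adaptive cycle length of ACBB omitted; parameter names are illustrative):

```python
import numpy as np

def cbb(grad, x0, m=4, alpha0=1.0, tol=1e-8, max_iter=1000):
    """Cyclic BB sketch: reuse the same BB1 stepsize for m consecutive
    iterations; the stepsize is recomputed once per cycle."""
    x = x0.copy()
    g = grad(x)
    alpha = alpha0
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        if (k + 1) % m == 0:                  # end of a cycle: refresh the stepsize
            s, y = x_new - x, g_new - g
            sty = s.dot(y)
            if sty > 0:
                alpha = s.dot(s) / sty        # BB1 stepsize
        x, g = x_new, g_new
    return x
```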

15.
The present study extends Barzilai and Borwein's method from unconstrained single-objective optimization problems to multiobjective ones. Compared with Newton, quasi-Newton, and steepest descent multiobjective optimization methods, the Barzilai and Borwein multiobjective optimization (BBMO) method requires simple and quick calculations, since it makes no use of line search methods, such as the Armijo rule, that necessitate function evaluations at each iteration. The innovative aspect of the current study is thus the absence of function evaluations, in contrast with the other multiobjective non-parametric methods (e.g., Newton, quasi-Newton, and steepest descent) investigated so far. The convergence of the BBMO method is proved for objective functions assumed to be twice continuously differentiable. MATLAB software was used to implement the BBMO method, and the results were compared with those of the other methods mentioned earlier. Using some performance measures, the quality of the nondominated frontier of BBMO was compared with those of the above-mentioned methods. In addition, the approximate nondominated frontiers obtained from the methods were compared with the exact nondominated frontier for some problems. Performance profiles are also considered to visualize the numerical results presented in tables.

16.
The alternating direction method of multipliers (ADMM) has recently received a lot of attention, especially due to its ability to harness the power of new parallel and distributed computing environments. However, ADMM can be notoriously slow, especially if the penalty parameter assigned to the augmented term in the objective function is not properly chosen. This paper aims to accelerate ADMM by integrating it with the Barzilai–Borwein gradient method and an acceleration technique known as line search. Line search accelerates an iterative method by performing a one-dimensional search along the line segment connecting two successive iterates. We pay special attention to large-scale nonnegative least squares problems, and our experiments using real datasets indicate that the integration not only accelerates ADMM but also robustifies it against the penalty parameter.
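For orientation, a plain ADMM baseline for the nonnegative least squares problem min_x ½‖Ax − b‖² s.t. x ≥ 0, using the split x = z (this is the textbook scheme such a method starts from; the BB and line-search acceleration described above are not reproduced here):

```python
import numpy as np

def admm_nnls(A, b, rho=1.0, tol=1e-6, max_iter=500):
    """Plain ADMM for min_x 0.5*||Ax - b||^2 s.t. x >= 0 via the split x = z."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                                    # scaled dual variable
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(max_iter):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))  # quadratic subproblem
        z_old = z
        z = np.maximum(0.0, x + u)                     # projection subproblem
        u = u + x - z                                  # dual update
        if (np.linalg.norm(x - z) <= tol              # primal residual
                and rho * np.linalg.norm(z - z_old) <= tol):  # dual residual
            break
    return z
```

A poorly chosen `rho` makes this baseline crawl, which is precisely the sensitivity the BB integration is reported to mitigate.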

17.
This paper presents a new method for steplength selection in the framework of spectral gradient methods. The steplength formula is based on an interpolation scheme as well as some modified secant equations. The corresponding algorithm selects the initial positive steplength per iteration according to the satisfaction of the secant condition, and then a backtracking procedure along the negative gradient is performed. Numerical experience shows that this algorithm compares favorably in efficiency with the standard Barzilai–Borwein method as well as with some other recently modified Barzilai–Borwein approaches.

18.
Adaptive Two-Point Stepsize Gradient Algorithm (Total citations: 7; self-citations: 0; cited by others: 7)
Combined with a nonmonotone line search, the two-point stepsize gradient method has been applied successfully to large-scale unconstrained optimization. However, the numerical performance of the algorithm depends heavily on M, one of the parameters in the nonmonotone line search, especially for ill-conditioned problems. This paper proposes an adaptive nonmonotone line search. The two-point stepsize gradient method is shown to be globally convergent with this adaptive nonmonotone line search. Numerical results show that the adaptive nonmonotone line search is especially suitable for the two-point stepsize gradient method.

19.
We propose a new inexact line search rule and analyze the global convergence and convergence rate of related descent methods. The new line search rule is similar to the Armijo line search rule and contains it as a special case. We can choose a larger stepsize in each line search procedure while maintaining the global convergence of the related line search methods. This idea makes it possible to design new line search methods in a wider sense. In some special cases, the new descent method reduces to the Barzilai and Borwein method. Numerical results show that the new line search methods are efficient for solving unconstrained optimization problems.

20.
In nonlinear problems, the Hasofer–Lind–Rackwitz–Fiessler algorithm of the first-order reliability method sometimes suffers from non-convergence. A new Hasofer–Lind–Rackwitz–Fiessler algorithm incorporating the Barzilai–Borwein step is investigated in this paper to speed up the rate of convergence and to perform in a stable manner. The algorithm is essentially built on the global Barzilai–Borwein gradient method and proceeds in two stages. The first stage, implemented by the traditional steepest descent method with specific decayed step sizes, prepares a good initial point for the global Barzilai–Borwein gradient algorithm in the second stage, which takes the merit function as the objective to locate the most probable failure point. The efficiency and convergence of the proposed method and some other reliability analysis methods are presented and discussed in detail through several numerical examples. It is found that the proposed method is stable and very efficient for nonlinear problems, except for highly nonlinear ones, and even more accurate than the descent direction method with step sizes following the fixed exponential decay strategy.
