Similar Articles
10 similar articles found.
1.
Based on the modified secant equation, we propose two new HS-type conjugate gradient formulas. Their forms are similar to the original HS conjugate gradient formula, and they inherit all the nice properties of the HS method. By utilizing the technique of the three-term HS method of Zhang et al. (2007) [15], and without requiring truncation or convexity of the objective function, we show that one method is globally convergent with the Wolfe line search and the other with the Armijo line search. Moreover, under some mild conditions, the linear convergence rate of the two modified methods is established. The numerical results show that the proposed methods are efficient.
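As a point of reference for the line searches invoked in this abstract, here is a minimal checker for the standard Wolfe conditions; it is a textbook sketch (the names `f`, `grad` and the constants `c1`, `c2` are illustrative), not the authors' implementation:

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.9):
    """Standard Wolfe conditions for a step length t along direction d:
    sufficient decrease:  f(x + t*d) <= f(x) + c1 * t * g^T d
    curvature:            grad(x + t*d)^T d >= c2 * g^T d."""
    g_d = grad(x) @ d
    sufficient_decrease = f(x + t * d) <= f(x) + c1 * t * g_d
    curvature = grad(x + t * d) @ d >= c2 * g_d
    return sufficient_decrease and curvature
```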

2.
Liu Jinkui. 《计算数学》 (Mathematica Numerica Sinica), 2013, 35(3): 286–296
Motivated by the structure of the CG-DESCENT algorithm [1] and the suggestions of Powell in the survey [11], two new nonlinear conjugate gradient algorithms for unconstrained optimization are proposed. They possess the sufficient descent property under any line search, and global convergence for general functions is guaranteed under the standard Wolfe line search. Numerical tests on a number of well-known functions from the CUTEr library, assessed with the Dolan & Moré [2] performance-profile methodology, demonstrate the effectiveness of the new algorithms.
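Since the algorithms are compared with the Dolan & Moré performance-profile methodology, the sketch below shows how such a profile is typically computed; the cost matrix and solver data are hypothetical, and this is not code from the cited paper:

```python
import numpy as np

def performance_profile(costs, taus):
    """costs[p, s]: cost (e.g. CPU time) of solver s on problem p, np.inf for a
    failure. Returns rho[s, t] = fraction of problems whose performance ratio
    r_{p,s} = costs[p, s] / min_s costs[p, s] is <= taus[t] (Dolan & More, 2002)."""
    ratios = costs / costs.min(axis=1, keepdims=True)
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(costs.shape[1])])

# Hypothetical data: 3 problems, 2 solvers (solver 2 fails on problem 3)
costs = np.array([[1.0, 2.0], [3.0, 1.5], [2.0, np.inf]])
print(performance_profile(costs, taus=[1.0, 2.0, 4.0]))
```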

3.
The Hestenes–Stiefel (HS) method is an efficient method for solving large-scale unconstrained optimization problems. In this paper, we extend the HS method to solve constrained nonlinear equations, and propose a modified HS projection method, which combines the modified HS method proposed by Zhang et al. with the projection method developed by Solodov and Svaiter. Under some mild assumptions, we show that the new method is globally convergent with an Armijo line search. Moreover, the R-linear convergence rate of the new method is established. Some preliminary numerical results show that the new method is efficient even for large-scale constrained nonlinear equations.
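A hedged sketch of a hyperplane-projection step in the spirit of Solodov and Svaiter, which the abstract combines with a modified HS direction. The constraint set is taken to be the nonnegative orthant purely for illustration, the backtracking condition is one commonly used variant, and `F`, `sigma`, `beta` are placeholder names:

```python
import numpy as np

def project_C(x):
    """Projection onto an illustrative constraint set C = {x : x >= 0}."""
    return np.maximum(x, 0.0)

def projection_step(F, x, d, sigma=1e-4, beta=0.5, max_backtracks=30):
    """One hyperplane-projection iteration: backtrack to find z = x + t*d with
    -F(z)^T d >= sigma * t * ||d||^2, then project x onto the hyperplane
    {u : F(z)^T (u - z) = 0} and back onto C."""
    t = 1.0
    for _ in range(max_backtracks):
        z = x + t * d
        Fz = F(z)
        if -Fz @ d >= sigma * t * (d @ d):
            lam = (Fz @ (x - z)) / (Fz @ Fz)   # step toward the hyperplane
            return project_C(x - lam * Fz)
        t *= beta
    return x  # backtracking failed; keep the current point
```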

4.
Tanabe (1988) proposed a variation of the classical Newton method for solving nonlinear systems of equations, the so-called Centered Newton method. His idea was based on deviating the Newton direction towards a variety called the “Central Variety”. In this paper we prove that the Centered Newton method is locally convergent, and we present a globally convergent method based on the centered direction used by Tanabe. We show the effectiveness of our proposal for solving nonlinear systems of equations and compare it with the Newton method with line search.
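For reference, a minimal sketch of the baseline the paper compares against: Newton's method for nonlinear systems globalized by backtracking on the merit function ½‖F(x)‖². This is not the Centered Newton method itself, and all names are illustrative:

```python
import numpy as np

def newton_line_search(F, J, x0, tol=1e-8, max_iter=100, sigma=1e-4, beta=0.5):
    """Newton's method for F(x) = 0, globalized by backtracking on the merit
    function m(x) = 0.5 * ||F(x)||^2; for the Newton direction d = -J(x)^{-1} F(x)
    the Armijo test reduces to m(x + t*d) <= (1 - 2*sigma*t) * m(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = np.linalg.solve(J(x), -Fx)          # Newton direction
        m0 = 0.5 * Fx @ Fx
        t = 1.0
        while 0.5 * np.dot(F(x + t * d), F(x + t * d)) > (1 - 2 * sigma * t) * m0:
            t *= beta                            # Armijo-type backtracking
            if t < 1e-12:
                break
        x = x + t * d
    return x
```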

5.
Optimization, 2012, 61(9): 1387–1400
Although the Hestenes–Stiefel (HS) method is well known, results on its convergence rate under an inexact line search are very rare. Recently, Zhang, Zhou and Li [Some descent three-term conjugate gradient methods and their global convergence, Optim. Methods Softw. 22 (2007), pp. 697–711] proposed a three-term Hestenes–Stiefel method for unconstrained optimization problems. In this article, we investigate the convergence rate of this method. We show that, under reasonable conditions, the three-term HS method with the Wolfe line search is n-step superlinearly and even quadratically convergent if a restart technique is used. Some numerical results are also reported to verify the theoretical results; moreover, the method is more efficient than the previous ones.
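The three-term HS direction of Zhang, Zhou and Li, as it is commonly stated, can be sketched as follows; by construction it satisfies $g_k^T d_k = -\|g_k\|^2$ regardless of the line search (any deviation from the authors' exact formulation, such as the steepest-descent fallback below, is an assumption of this sketch):

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old):
    """d = -g + beta_HS * d_old - theta * y, with y = g_new - g_old,
    beta_HS = g^T y / (d_old^T y) and theta = g^T d_old / (d_old^T y),
    so that g^T d = -||g||^2 exactly, independent of the line search."""
    y = g_new - g_old
    denom = d_old @ y
    if abs(denom) < 1e-12:
        return -g_new                      # fall back to steepest descent
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y

# Quick check of the descent identity on random data
g1, g0, d0 = (np.random.randn(5) for _ in range(3))
d1 = three_term_hs_direction(g1, g0, d0)
assert np.isclose(g1 @ d1, -g1 @ g1)
```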

6.
An assumption is imposed on the parameter in the line-search direction of the super-memory gradient algorithm for unconstrained optimization, which determines a new range of values for this parameter and guarantees that the search direction is a sufficient descent direction of the objective function; a new class of memory gradient algorithms is thereby proposed. Global convergence of the algorithm is established under the Armijo step-size search without assuming boundedness of the iterate sequence, and modified forms of the memory gradient method combining the FR, PR and HS conjugate gradient formulas are given. Numerical experiments show that the new algorithm is more stable and more efficient than the FR, PR and HS conjugate gradient methods and the super-memory gradient method under the Armijo line search.
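Since the analysis above is carried out under an Armijo step-size search, here is the generic Armijo backtracking rule for reference; it is the textbook rule with illustrative constants, not the paper's specific parameterized step rule:

```python
import numpy as np

def armijo_step(f, x, d, g, sigma=1e-4, beta=0.5, max_backtracks=50):
    """Largest t in {1, beta, beta^2, ...} (up to max_backtracks trials) with
    f(x + t*d) <= f(x) + sigma * t * g^T d, where g = grad f(x) and d is a
    descent direction (g^T d < 0)."""
    fx, gd = f(x), g @ d
    t = 1.0
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * gd:
            return t
        t *= beta
    return t
```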

7.
An improved HS conjugate gradient algorithm and its global convergence
Shi Zhenjun. 《计算数学》 (Mathematica Numerica Sinica), 2001, 23(4): 393–406
1. Introduction. In 1952, M. Hestenes and E. Stiefel proposed the conjugate gradient method for solving positive definite linear systems [1]. In 1964, R. Fletcher and C. Reeves extended the method to the unconstrained optimization problem $\min f(x)$, $x \in \mathbb{R}^n$ (1), where $f:\mathbb{R}^n \to \mathbb{R}$ is continuously differentiable; write $g_k = \nabla f(x_k)$, $x_k \in \mathbb{R}^n$. If the sequence $\{x_k\}$ is generated by the following algorithm, where $\beta_k = \dfrac{g_k^{T}(g_k - g_{k-1})}{d_{k-1}^{T}(g_k - g_{k-1})}$ (Hestenes–Stiefel) (4), then the algorithm is called the Hestenes–Stiefel conjugate gradient …
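A minimal sketch of the classical Hestenes–Stiefel iteration described in this excerpt, using formula (4) for $\beta_k$ and a user-supplied step-length rule; this is illustrative code for the classical method (with an added safeguard against a vanishing denominator), not the improved algorithm proposed in the paper:

```python
import numpy as np

def hs_cg(grad, x0, step_rule, tol=1e-6, max_iter=500):
    """Classical HS conjugate gradient: d_0 = -g_0 and
    d_k = -g_k + beta_k * d_{k-1} with beta_k as in formula (4);
    step_rule(x, d) must return a step length alpha_k > 0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + step_rule(x, d) * d
        g_new = grad(x)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0  # formula (4)
        d = -g_new + beta * d
        g = g_new
    return x
```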

8.
A modified HS conjugate gradient method and its convergence
It is well known that the direction generated by the Hestenes–Stiefel (HS) conjugate gradient method may not be a descent direction for the objective function. In this paper, we make a slight modification to the HS method so that the generated direction always satisfies the sufficient descent condition. An advantage of the modified Hestenes–Stiefel (MHS) method is that the scalar $\beta_k^{HS}$ remains nonnegative under the weak Wolfe–Powell line search. The global convergence result of the MHS method is established under some mild conditions. Preliminary numerical results show that the MHS method is slightly more efficient than the PRP and HS methods.

9.
Two modified Dai-Yuan nonlinear conjugate gradient methods
In this paper, we propose two modified versions of the Dai–Yuan (DY) nonlinear conjugate gradient method. One is based on the MBFGS method (Li and Fukushima, J Comput Appl Math 129:15–35, 2001) and inherits all the nice properties of the DY method; moreover, this method converges globally for nonconvex functions even if the standard Armijo line search is used. The other is based on the ideas of Wei et al. (Appl Math Comput 183:1341–1350, 2006) and Zhang et al. (Numer Math 104:561–572, 2006) and possesses the good performance of the Hestenes–Stiefel method. Numerical results are also reported. This work was supported by the Natural Science Foundation of China (10701018).
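For comparison with the HS-based coefficients above, the original Dai–Yuan coefficient that both modifications start from can be sketched as follows; the modified formulas of the paper are not reproduced here, and the safeguard against a vanishing denominator is an addition of this sketch:

```python
import numpy as np

def beta_dy(g_new, g_old, d_old):
    """Dai-Yuan coefficient: beta = ||g_new||^2 / (d_old^T (g_new - g_old))."""
    denom = d_old @ (g_new - g_old)
    return (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
```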

10.
Although the study of the global convergence of the Polak–Ribière–Polyak (PRP), Hestenes–Stiefel (HS) and Liu–Storey (LS) conjugate gradient methods has made great progress, the convergence of these algorithms for general nonlinear functions is still not fully settled, let alone under weak conditions on the objective function and weak line search rules. It is also interesting to investigate whether there exists a general method that converges under the standard Armijo line search for general nonconvex functions, since very few relevant results have been achieved. In this paper, we present a new general form of conjugate gradient methods whose theoretical significance is attractive. With any formula $\beta_k \ge 0$ and under weak conditions, the proposed method satisfies the sufficient descent condition independently of the line search used and of the convexity of the function, and its global convergence can be achieved under the standard Wolfe line search or even under the standard Armijo line search. Based on this new method, convergence results for the PRP, HS, LS, Dai–Yuan-type (DY) and Conjugate-Descent-type (CD) methods are established. Preliminary numerical results show the efficiency of the proposed methods.
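One well-known way to obtain a direction that satisfies the sufficient descent condition for any $\beta_k \ge 0$ and independently of the line search is to combine $\beta_k d_{k-1}$ with a projection orthogonal to $g_k$, which forces $g_k^T d_k = -\|g_k\|^2$. Whether this coincides with the authors' general form is an assumption, so the sketch below is illustrative only:

```python
import numpy as np

def descent_safe_direction(g, d_old, beta):
    """d = -g + beta * (d_old - (g^T d_old / ||g||^2) * g).
    The bracketed term is orthogonal to g, hence g^T d = -||g||^2
    for any beta >= 0 and independently of the line search."""
    g_sq = g @ g
    proj = d_old - ((g @ d_old) / g_sq) * g   # component of d_old orthogonal to g
    return -g + beta * proj

# Quick check of the descent identity on random data
g, d_old = np.random.randn(4), np.random.randn(4)
d = descent_safe_direction(g, d_old, beta=2.5)
assert np.isclose(g @ d, -(g @ g))
```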

