Similar Documents
1.
Based on the well-known PRP conjugate gradient method and exploiting the structure of the CG_DESCENT conjugate gradient method, this paper proposes a modified PRP conjugate gradient method for large-scale unconstrained optimization. At every iteration the method generates a sufficiently descending search direction, independently of any line search condition. Under the standard Wolfe line search, the global convergence and linear convergence rate of the modified PRP method are established. Numerical results show that the modified PRP method is very efficient on the given test problems.
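To make the ingredients of such PRP-type schemes concrete, here is a minimal Python sketch of a generic PRP+ conjugate gradient loop with a crude descent safeguard and an Armijo backtracking step. It only illustrates the standard textbook iteration, assuming the classical PRP+ formula and a backtracking line search; it is not the modified PRP method or the Wolfe line search analyzed above, and the function names and tolerances are illustrative choices.

import numpy as np

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=1000, c=1e-4, shrink=0.5):
    # Generic PRP+ nonlinear conjugate gradient loop (illustrative sketch only).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                        # safeguard: restart if d is not a descent direction
            d = -g
        alpha = 1.0                           # plain Armijo backtracking (the paper uses Wolfe)
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= shrink
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)   # classical PRP+ parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
rosen = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([-2*(1 - x[0]) - 400.0*x[0]*(x[1] - x[0]**2),
                                 200.0*(x[1] - x[0]**2)])
print(prp_plus_cg(rosen, rosen_grad, [-1.2, 1.0]))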

3.
A new conjugate gradient method is proposed in this paper. For any (inexact) line search, our scheme satisfies the sufficient descent property. The method is proved to be globally convergent if the restricted Wolfe-Powell line search is used. Preliminary numerical results show that it is efficient.

4.
刘金魁 《计算数学》2013,35(3):286-296
Following the structure of the CG-DESCENT algorithm [1] and a suggestion made by Powell in the survey [11], two new nonlinear conjugate gradient algorithms for unconstrained optimization are presented. Both possess the sufficient descent property under any line search and are globally convergent for general functions under the standard Wolfe line search. Experiments on a selection of well-known test functions from the CUTEr library, evaluated with the Dolan & Moré [2] performance profiles, demonstrate the efficiency of the new algorithms.

5.
Global convergence of a class of conjugate gradient methods under generalized line searches
Global convergence results are established for a class of conjugate gradient methods under generalized line searches.

6.
Optimization, 2012, 61(4): 993-1009
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have recently received much attention. In this paper, we propose a new two-parameter family of conjugate gradient methods for unconstrained optimization. The two-parameter family not only includes the three already existing practical nonlinear conjugate gradient methods, but also contains other families of conjugate gradient methods as subfamilies. The two-parameter family with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. The numerical results show that these methods are efficient for the given test problems. In addition, the methods related to this family are discussed in a unified way.
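For context, two-parameter (and three-parameter) families of this kind interpolate among the classical choices of the conjugate gradient parameter. The generic update and the standard FR, PRP, HS, and DY formulas, with $g_k = \nabla f(x_k)$, are recalled below; these are textbook formulas, not the particular two-parameter combination proposed in the paper.

\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
\[
  \beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
  \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top}(g_{k+1}-g_k)}{\|g_k\|^2}, \qquad
  \beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top}(g_{k+1}-g_k)}{d_k^{\top}(g_{k+1}-g_k)}, \qquad
  \beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{d_k^{\top}(g_{k+1}-g_k)}.
\]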

7.
徐泽水 《数学杂志》2002,22(1):27-30
This paper proposes a new class of conjugate gradient methods. The search directions generated during the iterations remain descent directions, and global convergence is proved under a class of inexact line search conditions.

8.
In this paper, a new gradient-related algorithm for solving large-scale unconstrained optimization problems is proposed. The new algorithm is a kind of line search method. The basic idea is to choose a combination of the current gradient and some previous search directions as a new search direction and to find a step size by using various inexact line searches. Using more information at the current iterative step may improve the performance of the algorithm; this motivates us to seek new gradient algorithms that may be more effective than standard conjugate gradient methods. The concept of uniformly gradient-related directions is useful for analyzing the global convergence of the new algorithm. The global convergence and linear convergence rate of the new algorithm are investigated under diverse weak conditions. Numerical experiments show that the new algorithm seems to converge more stably and is superior to other similar methods in many situations.

9.
Global convergence of a class of conjugate gradient methods under the Goldstein line search
徐泽水 《数学杂志》2000,20(1):13-16
This paper proves that the class of conjugate gradient methods proposed in [1] is globally convergent under the Goldstein inexact line search.

10.
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems. Recently, they have been much studied. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity of small steps, namely, if a small step is generated away from the solution point, the next search direction will be close to the negative gradient direction; and (ii) its descent property and global convergence are likely to be achieved provided that the line search satisfies the Wolfe conditions. Some numerical results with the family are also presented.

11.
The conjugate gradient method is one of the most commonly used methods in optimization and is widely applied to large-scale problems; different choices of the parameter β_k give rise to different conjugate gradient methods. A class of conjugate gradient algorithms with three parameters is presented. Under the given conditions, the chosen β_k is shown to generate a descent direction at every step, and the algorithm is globally convergent under the strong Wolfe line search.

12.
A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family not only includes the six already existing practical nonlinear conjugate gradient methods, but also subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell's restart criterion, the three-parameter family of methods with the strong Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the three-parameter family of methods. This paper can also be regarded as a brief review of nonlinear conjugate gradient methods.

13.
The conjugate gradient method is an important method for unconstrained optimization. This paper proposes a new family of conjugate gradient methods and proves its global convergence under generalized inexact Wolfe line search conditions. Finally, numerical experiments are reported, and the results confirm the effectiveness of the algorithm.

14.
连淑君  王长钰 《应用数学》2007,20(1):120-127
This paper studies a family of conjugate gradient methods that can be viewed as convex combinations of the FR and DY methods. Two Armijo-type line searches are proposed, and the global convergence of the family is analyzed under both line searches.

15.
This paper explores the convergence of nonlinear conjugate gradient methods with the Goldstein line search and without regular restarts. Under this line search, global convergence of a subsequence is established for the famous Fletcher-Reeves conjugate gradient method. The same result can be obtained for the Polak-Ribière-Polyak method and others. *This work was partially supported by the National Hitech Program (863, 2002AA104540) and the National Natural Science Foundation of China (No. 60373060).
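For reference, the standard Goldstein conditions imposed on the step size $\alpha_k$ (with a constant $0 < c < 1/2$ and $g_k = \nabla f(x_k)$) read as follows; the paper may use a slightly different formulation.

\[
  f(x_k) + (1-c)\,\alpha_k g_k^{\top} d_k \;\le\; f(x_k + \alpha_k d_k) \;\le\; f(x_k) + c\,\alpha_k g_k^{\top} d_k .
\]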

16.
Conjugate Gradient Methods with Armijo-type Line Searches
Two Armijo-type line searches are proposed in this paper for nonlinear conjugate gradient methods. Under these line searches, global convergence results are established for several famous conjugate gradient methods, including the Fletcher-Reeves method, the Polak-Ribière-Polyak method, and the conjugate descent method.
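A minimal Python routine enforcing the basic Armijo condition is sketched below; the two specific Armijo-type searches proposed in the paper add further requirements that are not reproduced here, and the parameter values are illustrative defaults.

import numpy as np

def armijo_backtracking(f, x, d, g, c=1e-4, shrink=0.5, alpha0=1.0, max_backtracks=50):
    # Return a step size alpha with f(x + alpha*d) <= f(x) + c*alpha*g'd,
    # assuming d is a descent direction (g'd < 0).
    fx = f(x)
    slope = g @ d
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= shrink
    return alpha   # fallback: smallest step tried

# Example on the quadratic f(x) = 0.5*||x||^2 with the steepest descent direction.
x = np.array([1.0, 1.0]); g = x.copy(); d = -g
print(armijo_backtracking(lambda z: 0.5 * (z @ z), x, d, g))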

17.
This paper proposes a new class of conjugate gradient methods related to the HS method. Under the strong Wolfe line search, the methods guarantee the sufficient descent property of the search direction, and their global convergence is proved without assuming that the objective function is convex. A particular member of the new class is also given, and by tuning the parameter ρ its effectiveness on the given test functions is verified.

18.
Global convergence of a new family of conjugate gradient methods
The conjugate gradient method is an important method for unconstrained optimization and is particularly well suited to large-scale problems. This paper proposes a new family of conjugate gradient methods and proves its global convergence under generalized inexact Wolfe line search conditions. Finally, numerical experiments are reported, and the results confirm the effectiveness of the algorithm.

19.
The conjugate gradient method is one of the most effective methods for large-scale unconstrained optimization. The parameter formula of the HS conjugate gradient method is modified to obtain a new formula, on which an algorithmic framework is built. Independently of any line search, the search directions produced by the framework are shown to satisfy the sufficient descent condition, and global convergence of the algorithm is proved under the standard Wolfe line search. Finally, numerical tests of the new algorithm show that the modified method is effective.

20.
This paper proposes a line search technique to satisfy a relaxed form of the strong Wolfe conditions in order to guarantee the descent condition at each iteration of the Polak-Ribière-Polyak conjugate gradient algorithm. It is proved that this line search algorithm preserves the usual convergence properties of any descent algorithm. In particular, it is shown that the Zoutendijk condition holds under mild assumptions. It is also proved that the resulting conjugate gradient algorithm is convergent under a strong convexity assumption. For the nonconvex case, a globally convergent modification is proposed. Numerical tests are presented.
This paper is based on an earlier work presented at the International Symposium on Mathematical Programming in Lausanne in 1997. The author thanks J. C. Gilbert for his advice and M. Albaali for some recent discussions which motivated him to write this paper. Special thanks to G. Liu, J. Nocedal, and R. Waltz for the availability of the software CG+ and to one of the referees who indicated to him the paper of Grippo and Lucidi (Ref. 1).
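For comparison, the standard strong Wolfe conditions that the proposed line search relaxes are, with constants $0 < c_1 < c_2 < 1$,

\[
  f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\top} d_k,
  \qquad
  \bigl|\nabla f(x_k + \alpha_k d_k)^{\top} d_k\bigr| \le c_2 \bigl|\nabla f(x_k)^{\top} d_k\bigr| ;
\]

the relaxed form actually used in the paper is not reproduced here.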
