Similar Documents
20 similar documents found.
1.
Jiang Xianzhen, Liao Wei, Yin Jianghua, Jian Jinbao. Numerical Algorithms, 2022, 91(1): 161-191.

In this paper, based on the hybrid conjugate gradient method and the convex combination technique, a new family of hybrid three-term conjugate gradient methods is proposed for solving unconstrained optimization problems. The conjugate parameter in the search direction is a hybrid of the Dai-Yuan conjugate parameter and an arbitrary conjugate parameter. The search direction is then the sum of the negative gradient direction and a convex combination of the last search direction and the gradient at the previous iteration. Without choosing any specific conjugate parameter, we show that the search directions generated by the family always possess the descent property, independent of the line search technique, and that the family is globally convergent under the usual assumptions and the weak Wolfe line search. To verify the effectiveness of the presented family, we further design a specific conjugate parameter and perform medium- and large-scale numerical experiments on smooth unconstrained optimization and image restoration problems. The numerical results show the encouraging efficiency and applicability of the proposed methods, even compared with state-of-the-art methods.
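For orientation only, here is one hedged schematic of a three-term direction of the kind described above, written with the standard Dai-Yuan parameter and a convex-combination weight θ_k; the exact coefficients attached to the previous direction and gradient, and the second conjugate parameter, are defined in the paper itself and may differ:
\[
d_0=-g_0,\qquad d_k=-g_k+\beta_k\bigl(\theta_k\,d_{k-1}+(1-\theta_k)\,g_{k-1}\bigr),\qquad \theta_k\in[0,1],
\]
\[
\beta_k^{DY}=\frac{\|g_k\|^2}{d_{k-1}^{T}y_{k-1}},\qquad y_{k-1}=g_k-g_{k-1}.
\]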


2.
A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family not only includes the six existing practical nonlinear conjugate gradient methods, but also subsumes some other families of nonlinear conjugate gradient methods as subfamilies. With Powell's restart criterion, the three-parameter family of methods with the strong Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the three-parameter family of methods. This paper can also be regarded as a brief review of nonlinear conjugate gradient methods.
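For reference, the six classical conjugate parameters that such a family subsumes (Fletcher-Reeves, Polak-Ribière-Polyak, Hestenes-Stiefel, Dai-Yuan, conjugate descent, Liu-Storey), with y_{k-1} = g_k - g_{k-1}; the family's own three-parameter formula is given in the paper itself:
\[
\beta_k^{FR}=\frac{\|g_k\|^2}{\|g_{k-1}\|^2},\qquad
\beta_k^{PRP}=\frac{g_k^{T}y_{k-1}}{\|g_{k-1}\|^2},\qquad
\beta_k^{HS}=\frac{g_k^{T}y_{k-1}}{d_{k-1}^{T}y_{k-1}},
\]
\[
\beta_k^{DY}=\frac{\|g_k\|^2}{d_{k-1}^{T}y_{k-1}},\qquad
\beta_k^{CD}=\frac{\|g_k\|^2}{-d_{k-1}^{T}g_{k-1}},\qquad
\beta_k^{LS}=\frac{g_k^{T}y_{k-1}}{-d_{k-1}^{T}g_{k-1}}.
\]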



3.
Conjugate gradient methods are attractive iterative methods for solving large-scale unconstrained optimization problems, and much recent research has therefore focused on developing more effective conjugate gradient methods. In this paper, we propose another hybrid conjugate gradient method, defined as a linear combination of the Dai-Yuan (DY) method and the Hestenes-Stiefel (HS) method. The sufficient descent condition and the global convergence of this method are established under the generalized Wolfe line search conditions. Compared with other conjugate gradient methods, the proposed method gives good numerical results and is effective.
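A hedged sketch of the hybridization described above: with y_{k-1} = g_k - g_{k-1}, a convex combination of the HS and DY parameters takes the form below; the rule the paper uses to select the weight θ_k is not reproduced here.
\[
\beta_k^{hyb}=(1-\theta_k)\,\beta_k^{HS}+\theta_k\,\beta_k^{DY}
=(1-\theta_k)\,\frac{g_k^{T}y_{k-1}}{d_{k-1}^{T}y_{k-1}}+\theta_k\,\frac{\|g_k\|^2}{d_{k-1}^{T}y_{k-1}},\qquad \theta_k\in[0,1].
\]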

4.
Based on the well-known PRP conjugate gradient method and exploiting the structure of the CG_DESCENT conjugate gradient method, this paper proposes a modified PRP conjugate gradient method for solving large-scale unconstrained optimization problems. At every iteration the method generates a sufficient descent search direction, independently of any line search condition. Under the standard Wolfe line search, the global convergence and the linear convergence rate of the modified PRP conjugate gradient method are proved. Numerical results show that the modified PRP method is very effective on the given test problems.
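For context, the classical PRP parameter and the CG_DESCENT (Hager-Zhang) parameter whose structure the modification borrows are shown below, with y_{k-1} = g_k - g_{k-1}; the modified PRP formula itself is defined in the paper:
\[
\beta_k^{PRP}=\frac{g_k^{T}y_{k-1}}{\|g_{k-1}\|^2},\qquad
\beta_k^{HZ}=\frac{1}{d_{k-1}^{T}y_{k-1}}\Bigl(y_{k-1}-2\,\frac{\|y_{k-1}\|^2}{d_{k-1}^{T}y_{k-1}}\,d_{k-1}\Bigr)^{T}g_k.
\]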

6.
The conjugate gradient method is one of the most effective methods for solving large-scale unconstrained optimization problems. By improving the parameter formula of the HS conjugate gradient method, a new formula is obtained, and an algorithmic framework is built upon it. Without relying on any line search, it is proved that every iterative direction produced by the framework satisfies the sufficient descent condition, and the global convergence of the algorithm is proved under the standard Wolfe line search. Finally, numerical tests of the new algorithm show that the improved method is effective.

7.
By means of a conjugate gradient strategy, we propose a trust-region method for unconstrained optimization problems. The search direction is a suitable combination of the conjugate gradient direction and the trust-region direction. The global convergence and the quadratic convergence of this method are established under suitable conditions. Numerical results show that the presented method is competitive with the trust-region method and the conjugate gradient method.

8.
Building on the class of new conjugate gradient methods proposed in reference [1], this paper presents two new classes of nonlinear descent conjugate gradient methods for unconstrained optimization. Without any line search, both classes are guaranteed to generate a descent direction at every iteration. For general nonconvex functions, we prove the global convergence of the two new classes under the Wolfe line search conditions.

9.
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have recently received much attention. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity for small steps, namely, if a small step is generated away from the solution point, the next search direction will be close to the negative gradient direction; and (ii) its descent property and global convergence are likely to be achieved provided that the line search satisfies the Wolfe conditions. Some numerical results with the family are also presented.



10.
Conjugate gradient methods are a widely applied class of methods for solving large-scale unconstrained optimization problems. A new nonlinear conjugate gradient (CG) method is proposed, and theoretical analysis shows that the new algorithm has the sufficient descent property under several line search conditions. A global convergence theorem for the new CG algorithm is further proved. Finally, extensive numerical experiments show that, compared with several classical CG methods, the new algorithm has more efficient computational performance.

11.
A new family of conjugate gradient methods
In this paper we develop a new class of conjugate gradient methods for unconstrained optimization problems. A new nonmonotone line search technique is proposed to guarantee the global convergence of these conjugate gradient methods under some mild conditions. In particular, the Polak-Ribière-Polyak and Liu-Storey conjugate gradient methods are special cases of the new class. By estimating the local Lipschitz constant of the derivative of the objective function, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that these new conjugate gradient methods are effective for minimizing large-scale non-convex non-quadratic functions.
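As a point of reference, a standard nonmonotone Armijo-type condition of the kind alluded to above accepts a step size α_k satisfying the inequality below, where m(k) bounds how far back the reference value looks; the paper's own nonmonotone rule and its Lipschitz-based initial step size may differ in detail:
\[
f(x_k+\alpha_k d_k)\;\le\;\max_{0\le j\le m(k)} f(x_{k-j})\;+\;\delta\,\alpha_k\,g_k^{T}d_k,\qquad \delta\in(0,1).
\]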

12.
Min Li. Optimization Letters, 2018, 12(8): 1911-1927.
Based on the memoryless BFGS quasi-Newton method, a family of three-term nonlinear conjugate gradient methods is proposed. For any line search, the directions generated by the new methods are sufficient descent directions. Using some efficient techniques, global convergence results are established when the line search fulfills the Wolfe or the Armijo conditions. Moreover, the r-linear convergence rate of the methods is analyzed as well. Numerical comparisons show that the proposed methods are efficient for the unconstrained optimization problems in the CUTEr library.
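For reference, the memoryless BFGS direction that underlies such three-term families can be written explicitly, with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}; the paper's own directions modify this template rather than reproduce it:
\[
d_k=-g_k+\Bigl(\frac{g_k^{T}y_{k-1}}{s_{k-1}^{T}y_{k-1}}-\Bigl(1+\frac{\|y_{k-1}\|^2}{s_{k-1}^{T}y_{k-1}}\Bigr)\frac{g_k^{T}s_{k-1}}{s_{k-1}^{T}y_{k-1}}\Bigr)s_{k-1}+\frac{g_k^{T}s_{k-1}}{s_{k-1}^{T}y_{k-1}}\,y_{k-1}.
\]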

13.
In this paper, we propose a three-term conjugate gradient method via the symmetric rank-one (SR1) update. The basic idea is to exploit the good properties of the SR1 update in providing quality Hessian approximations in order to construct a conjugate gradient search direction that requires no matrix storage and possesses the sufficient descent property. Numerical experiments on a set of standard unconstrained optimization problems show that the proposed method is superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
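For reference, the symmetric rank-one (SR1) update on which such a construction is based reads as follows; how the paper turns it into a matrix-free three-term direction is not reproduced here:
\[
B_{k+1}=B_k+\frac{(y_k-B_k s_k)(y_k-B_k s_k)^{T}}{(y_k-B_k s_k)^{T}s_k},\qquad s_k=x_{k+1}-x_k,\quad y_k=g_{k+1}-g_k.
\]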

14.
Conjugate gradient methods have attracted attention because they can be directly applied to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method which always generates a descent search direction.
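For context, the Dai-Liao parameter built on the secant condition reads as follows, with t > 0 a parameter, s_{k-1} = x_k - x_{k-1}, and y_{k-1} = g_k - g_{k-1}; as noted above, it does not by itself guarantee a descent direction:
\[
\beta_k^{DL}=\frac{g_k^{T}\bigl(y_{k-1}-t\,s_{k-1}\bigr)}{d_{k-1}^{T}y_{k-1}}.
\]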

15.
A family of new conjugate gradient methods is proposed based on Perry's idea; the family satisfies the descent property or the sufficient descent property for any line search. In addition, based on a scaling technique and a restarting strategy, a family of scaling symmetric Perry conjugate gradient methods with restarting procedures is presented. The memoryless BFGS method and the SCALCG method are special forms of the two families of new methods, respectively. Moreover, several concrete new algorithms are suggested. Under Wolfe line searches, the global convergence of the two families of new methods is proved via spectral analysis for uniformly convex and nonconvex functions. Preliminary numerical comparisons with the CG_DESCENT and SCALCG algorithms show that the new algorithms are very effective for large-scale unconstrained optimization problems. Finally, a remark for further research is given.

16.
Wang Kairong, Liu Ben. Mathematica Numerica Sinica, 2012, 34(1): 81-92.
Conjugate gradient methods are a very important class of methods for solving large-scale unconstrained optimization problems. In this paper, a new conjugate gradient method is proposed via a modified BFGS formula. The method has the sufficient descent property independently of the line search. For general nonlinear functions, the global convergence of the method is proved. Numerical results show that the method is effective.

17.
A restricted PR conjugate gradient method and its global convergence
Shi Zhenjun. Advances in Mathematics (China), 2002, 31(1): 47-55.
The PR (Polak-Ribière) conjugate gradient method is one of the effective algorithms for solving large-scale unconstrained optimization problems, but its global convergence had not been theoretically resolved. In this paper, the parameter β in the PR conjugate gradient method is restricted, a restricted PR conjugate gradient method is proposed, and the global convergence of the algorithm under the Armijo line search is proved. Numerical experiments show that the algorithm is very effective.
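For context, the PR parameter and the most common nonnegativity restriction are shown below; the specific restriction on β analyzed in the paper may differ from this standard PRP+ truncation:
\[
\beta_k^{PRP}=\frac{g_k^{T}(g_k-g_{k-1})}{\|g_{k-1}\|^{2}},\qquad
\beta_k^{PRP+}=\max\bigl\{\beta_k^{PRP},\,0\bigr\}.
\]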

18.
By minimizing two different upper bounds of the matrix that generates the search directions of the nonlinear conjugate gradient method proposed by Dai and Liao, two modified conjugate gradient methods are proposed. Under proper conditions, it is shown that the methods are globally convergent when the line search fulfills the strong Wolfe conditions. Numerical comparisons between the implementations of the proposed methods and the conjugate gradient methods proposed by Hager and Zhang, and by Dai and Kou, are made on a set of unconstrained optimization test problems from the CUTEr collection. The results show the efficiency of the proposed methods in the sense of the performance profile introduced by Dolan and Moré.

19.
Based on two modified secant equations proposed by Yuan, and by Li and Fukushima, we extend the approach proposed by Andrei and introduce two hybrid conjugate gradient methods for unconstrained optimization problems. Our methods are hybridizations of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods. Under proper conditions, we show that one of the proposed algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. To enhance the performance of the line search procedure, we propose a new approach for computing the initial step length used to start the line search. We compare implementations of our algorithms with two efficient representative hybrid conjugate gradient methods proposed by Andrei, using unconstrained optimization test problems from the CUTEr collection. Numerical results show that, in the sense of the performance profile introduced by Dolan and Moré, the proposed hybrid algorithms are competitive, and in some cases more efficient.

20.
This article proposes a new conjugate gradient method for unconstrained optimization by applying the Powell symmetrical technique in a defined sense. Using the Wolfe line search conditions, the global convergence of the method is obtained from the spectral analysis of the conjugate gradient iteration matrix and the Zoutendijk condition for steepest descent methods. Preliminary numerical results on a set of 86 unconstrained optimization test problems verify the performance of the algorithm and show that the Generalized Descent Symmetrical Hestenes-Stiefel algorithm is competitive with the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP+) algorithms.
