Similar Literature
20 similar references found.
1.
Li, Xiangli; Zhao, Wenjuan. Numerical Algorithms (2022) 90(3): 1017–1042.
There are many conjugate gradient methods for solving unconstrained optimization problems. Compared with the conjugate gradient method, the accelerated conjugate gradient...

2.
In this work, we present a new hybrid conjugate gradient method based on the convex hybridization of the conjugate gradient update parameters of DY and HS+, adapting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of our proposed method is that it ensures sufficient descent independently of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that the proposed method compares favorably with classic conjugate gradient methods and is in general superior in terms of efficiency and robustness.
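A hedged sketch of the construction (our notation; the exact formulas are in the paper): the hybrid parameter is the convex combination

$$\beta_k(\theta_k) = (1-\theta_k)\,\beta_k^{HS+} + \theta_k\,\beta_k^{DY}, \qquad d_k(\theta_k) = -g_k + \beta_k(\theta_k)\, d_{k-1}, \qquad \theta_k \in [0,1],$$

and $\theta_k$ is obtained, presumably in closed form, from the one-dimensional least-squares problem $\min_{\theta_k} \|d_k(\theta_k) - d_k^{SSML}\|^2$, where $d_k^{SSML}$ denotes the self-scaling memoryless BFGS direction.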

3.
In order to propose a scaled conjugate gradient method, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez are hybridized following Andrei’s approach. Since the proposed method is designed based on a revised form of a modified secant equation suggested by Zhang et al., one of its interesting features is that it uses the available function values in addition to the gradient values. It is shown that, for uniformly convex objective functions, the search directions of the method fulfill the sufficient descent condition, which leads to global convergence. Numerical comparisons of the method with an efficient scaled conjugate gradient method proposed by Andrei, made on a set of unconstrained optimization test problems from the CUTEr collection, show the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.
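For reference, the modified secant equation of Zhang et al. that such methods build on is commonly written as (a hedged recollection; the paper uses a revised form of it)

$$B_{k+1} s_k = \bar{y}_k, \qquad \bar{y}_k = y_k + \frac{\vartheta_k}{s_k^T u}\, u, \qquad \vartheta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^T s_k,$$

where $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and $u$ is any vector with $s_k^T u \neq 0$; the term $\vartheta_k$ is what injects function-value information in addition to gradients.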

4.
In this paper, we describe an implementation and give performance results for a conjugate gradient algorithm for unconstrained optimization. The algorithm is based upon the Nazareth three-term formula and incorporates Allwright preconditioning matrices and restart tests. The performance results for this combination compare favorably with existing codes. The support of the Science and Engineering Research Council is gratefully acknowledged.

5.
A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family not only includes the six already existing practical nonlinear conjugate gradient methods, but also subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell's restart criterion, the three-parameter family with the strong Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. This paper can also be regarded as a brief review of nonlinear conjugate gradient methods.
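A hedged sketch of what such a family can look like (our notation; the paper's exact parameterization may differ): one three-parameter form that recovers the six classical methods is

$$\beta_k = \frac{(1-\lambda_k)\,\|g_k\|^2 + \lambda_k\, g_k^T y_{k-1}}{(1-\mu_k-\omega_k)\,\|g_{k-1}\|^2 + \mu_k\, d_{k-1}^T y_{k-1} - \omega_k\, d_{k-1}^T g_{k-1}},$$

with $\lambda_k \in [0,1]$, $\mu_k \in [0,1]$, $\omega_k \in [0, 1-\mu_k]$: the corner choices of $(\lambda_k, \mu_k, \omega_k)$ give FR $(0,0,0)$, PRP $(1,0,0)$, DY $(0,1,0)$, HS $(1,1,0)$, CD $(0,0,1)$ and LS $(1,0,1)$.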



6.
Optimization (2012) 61(4): 993–1009.
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and have recently been much studied. In this paper, we propose a new two-parameter family of conjugate gradient methods for unconstrained optimization. The two-parameter family not only includes the three already existing practical nonlinear conjugate gradient methods, but also contains other families of conjugate gradient methods as subfamilies. The two-parameter family with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. The numerical results show that the methods are efficient for the given test problems. In addition, methods related to this family are discussed in a unified way.
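Since all of these families share the same algorithmic skeleton, a minimal sketch may help fix ideas. The following is illustrative only: the PRP+ update below is a stand-in, not the two-parameter family of the paper, and the quadratic test problem is ours.

```python
# Hedged sketch: a generic nonlinear conjugate gradient skeleton into which
# any beta-family (one-, two-, or three-parameter) can be plugged.
import numpy as np
from scipy.optimize import line_search  # Wolfe-condition line search

def beta_prp(g_new, g_old, d_old):
    """PRP+ (nonnegative Polak-Ribiere-Polyak) parameter, an illustrative stand-in."""
    y = g_new - g_old
    return max(0.0, g_new @ y / (g_old @ g_old))

def nonlinear_cg(f, grad, x0, beta=beta_prp, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:              # line search failed: restart along -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + beta(g_new, g, d) * d   # pluggable update parameter
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic (illustrative):
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(nonlinear_cg(f, grad, np.ones(3)))
```

Switching between members of such families amounts to passing a different `beta` callable to `nonlinear_cg`.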

7.
Jiang, Xianzhen; Liao, Wei; Yin, Jianghua; Jian, Jinbao. Numerical Algorithms (2022) 91(1): 161–191.

In this paper, based on the hybrid conjugate gradient method and the convex combination technique, a new family of hybrid three-term conjugate gradient methods is proposed for solving unconstrained optimization. The conjugate parameter in the search direction is a hybrid of the Dai-Yuan conjugate parameter and an arbitrary one. The search direction is then the sum of the negative gradient direction and a convex combination involving the last search direction and the gradient at the previous iteration. Without choosing any specific conjugate parameters, we show that the search directions generated by the family always possess the descent property independently of the line search technique, and that the family is globally convergent under usual assumptions and the weak Wolfe line search. To verify the effectiveness of the presented family, we further design a specific conjugate parameter and perform medium- and large-scale numerical experiments on smooth unconstrained optimization and image restoration problems. The numerical results show the encouraging efficiency and applicability of the proposed methods, even compared with state-of-the-art methods.
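A hedged reading of the construction (our notation; the exact formulas are in the paper): the conjugate parameter is a convex combination

$$\beta_k = \theta_k\,\beta_k^{DY} + (1-\theta_k)\,\beta_k^{\ast}, \qquad \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T y_{k-1}}, \qquad \theta_k \in [0,1],$$

where $\beta_k^{\ast}$ is an arbitrary conjugate parameter, and the three-term direction adds to $-g_k$ a combination of $d_{k-1}$ and $g_{k-1}$ weighted by this hybrid parameter.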


8.
9.
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak and Ribière, and by Polyak, is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independently of the line search and of the objective function convexity. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
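For context, the three-term direction of Zhang et al. that this hybridization builds on is usually written as (a hedged recollection)

$$d_k = -g_k + \beta_k^{PRP}\, d_{k-1} - \theta_k\, y_{k-1}, \qquad \beta_k^{PRP} = \frac{g_k^T y_{k-1}}{\|g_{k-1}\|^2}, \qquad \theta_k = \frac{g_k^T d_{k-1}}{\|g_{k-1}\|^2},$$

for which a direct computation gives $g_k^T d_k = -\|g_k\|^2$, i.e., sufficient descent holds for any line search and without convexity assumptions.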

10.
A new family of conjugate gradient methods
In this paper we develop a new class of conjugate gradient methods for unconstrained optimization problems. A new nonmonotone line search technique is proposed to guarantee the global convergence of these conjugate gradient methods under some mild conditions. In particular, the Polak–Ribière–Polyak and Liu–Storey conjugate gradient methods are special cases of the new class. By estimating the local Lipschitz constant of the derivative of the objective function, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that these new conjugate gradient methods are effective in minimizing large-scale non-convex non-quadratic functions.
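A minimal sketch of the step-size idea described above, assuming the simplest finite-difference Lipschitz estimate; the parameter `delta` and the safeguards are ours, and the paper's actual rule may differ.

```python
# Hedged sketch: estimate the local Lipschitz constant of the gradient from
# the last two iterates, then take a trial step proportional to 1/L.
import numpy as np

def lipschitz_step(x, x_prev, g, g_prev, delta=1.0, eps=1e-12):
    """Return a trial step size alpha ~ delta / L_k, where
    L_k ~= ||g - g_prev|| / ||x - x_prev|| approximates the local
    Lipschitz constant of the gradient near x."""
    L = np.linalg.norm(g - g_prev) / max(np.linalg.norm(x - x_prev), eps)
    return delta / max(L, eps)
```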

11.
In this paper, we propose a family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations. They derive from two modified conjugate gradient methods [W.Y. Cheng, A two-term PRP-based descent method, Numer. Funct. Anal. Optim. 28 (2007) 1217–1230; L. Zhang, W.J. Zhou, D.H. Li, A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence, IMA J. Numer. Anal. 26 (2006) 629–640] recently proposed for unconstrained optimization problems. Under appropriate conditions, the global convergence of the proposed method is established. Preliminary numerical results show that the proposed method is promising.
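Schematically (our notation, hedged), such derivative-free methods replace the gradient by the residual $F_k = F(x_k)$ in a descent conjugate gradient direction, e.g.

$$d_k = -F_k + \beta_k\, d_{k-1}, \qquad \beta_k = \frac{F_k^T (F_k - F_{k-1})}{\|F_{k-1}\|^2},$$

combined with a line search that uses only values of $F$, so no Jacobian information is required.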

12.
The Generalized Conjugate Gradient method (see [1]) is an iterative method for nonsymmetric linear systems. We obtain generalizations of this method for nonlinear systems with nonsymmetric Jacobians. We prove global convergence results.

13.
14.
In this paper, we propose a three-term conjugate gradient method based on the symmetric rank-one (SR1) update. The basic idea is to exploit the good properties of the SR1 update in providing quality Hessian approximations in order to construct a conjugate gradient search direction that requires no matrix storage and possesses the sufficient descent property. Numerical experiments on a set of standard unconstrained optimization problems show that the proposed method is superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
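A hedged sketch of the underlying idea in its memoryless form (the paper's construction may include scaling and safeguards): taking the SR1 inverse update with $H_k = I$,

$$H_{k+1} = I + \frac{(s_k - y_k)(s_k - y_k)^T}{(s_k - y_k)^T y_k},$$

gives the matrix-free direction

$$d_{k+1} = -H_{k+1} g_{k+1} = -g_{k+1} - \frac{(s_k - y_k)^T g_{k+1}}{(s_k - y_k)^T y_k}\,(s_k - y_k),$$

a three-term direction in $g_{k+1}$, $s_k$ and $y_k$ that needs no matrix storage.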

15.
In this paper, a three-term conjugate gradient algorithm is developed for solving large-scale unconstrained optimization problems. The search direction at each iteration is obtained by rectifying the steepest descent direction with the difference between the current and previous iterates and the difference between the corresponding gradients. It is proved that such a direction satisfies an approximate secant condition as well as a conjugacy condition. Acceleration and restart strategies are incorporated into the algorithm to improve its numerical performance. Global convergence of the proposed algorithm is established under two mild assumptions. On 75 benchmark test problems from the literature, the results indicate that the algorithm developed in this paper outperforms existing state-of-the-art algorithms of the same type.
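In our notation (hedged; the paper's coefficients are specific), such a direction has the generic three-term form

$$d_k = -g_k + a_k\, s_{k-1} + b_k\, y_{k-1}, \qquad s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1},$$

with $a_k$, $b_k$ chosen so that, for instance, a Dai–Liao-type conjugacy condition $d_k^T y_{k-1} = -t\, g_k^T s_{k-1}$ ($t \geq 0$) and an approximate secant condition hold.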

16.
Two modified Dai-Yuan nonlinear conjugate gradient methods
In this paper, we propose two modified versions of the Dai-Yuan (DY) nonlinear conjugate gradient method. One is based on the MBFGS method (Li and Fukushima, J Comput Appl Math 129:15–35, 2001) and inherits all the nice properties of the DY method; moreover, it converges globally for nonconvex functions even if the standard Armijo line search is used. The other is based on the ideas of Wei et al. (Appl Math Comput 183:1341–1350, 2006) and Zhang et al. (Numer Math 104:561–572, 2006), and inherits the good performance of the Hestenes-Stiefel method. Numerical results are also reported. This work was supported by the National Natural Science Foundation of China (10701018).
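For reference, the MBFGS modification of Li and Fukushima replaces $y_{k-1}$ by (a hedged recollection of their rule)

$$\bar{y}_{k-1} = y_{k-1} + t_{k-1}\, s_{k-1}, \qquad t_{k-1} = 1 + \max\left\{0,\; -\frac{y_{k-1}^T s_{k-1}}{\|s_{k-1}\|^2}\right\},$$

which enforces $\bar{y}_{k-1}^T s_{k-1} > 0$ even for nonconvex $f$; plugging $\bar{y}_{k-1}$ into the DY parameter presumably yields $\beta_k = \|g_k\|^2 / (d_{k-1}^T \bar{y}_{k-1})$.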

17.
The global convergence of general three-term conjugate gradient methods with the relaxed strong Wolfe line search is proved.

18.
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and have recently been much studied. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity to take small steps, namely, if a small step is generated away from the solution point, the next search direction will be close to the negative gradient direction; and (ii) its descent property and global convergence can be achieved provided that the line search satisfies the Wolfe conditions. Some numerical results with the family are also presented.



19.
Numerical Algorithms - In this work, we propose an algorithm for solving systems of nonlinear equations. The idea is a combination of the descent Dai-Liao method by Babaie-Kafaki and Ghanbari (Optim....

20.
A new family of conjugate gradient methods is proposed by minimizing the distance between two certain directions. It is a subfamily of the Dai–Liao family and contains the Hager–Zhang family and the Dai–Kou method. The direction of the proposed method is an approximation to that of the memoryless Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. For suitable parameter intervals, the direction of the proposed method possesses the sufficient descent property independently of the line search. Under mild assumptions, we analyze the global convergence of the method for strongly convex functions and for general functions, where the stepsize is obtained by the standard Wolfe rules. Numerical results indicate that the proposed method is promising and outperforms CGOPT and CG_DESCENT on a set of unconstrained optimization test problems.
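For orientation (our notation, hedged): the Dai–Liao family referred to here is

$$\beta_k^{DL}(t) = \frac{g_k^T y_{k-1} - t\, g_k^T s_{k-1}}{d_{k-1}^T y_{k-1}}, \qquad t \geq 0,$$

with Hager–Zhang corresponding to $t = 2\|y_{k-1}\|^2/(s_{k-1}^T y_{k-1})$; the parameter of the proposed subfamily is presumably fixed by minimizing the distance between $d_k(t) = -g_k + \beta_k^{DL}(t)\, d_{k-1}$ and the memoryless BFGS direction.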
