

A descent hybrid conjugate gradient method based on the memoryless BFGS update
Authors:Ioannis E Livieris  Vassilis Tampakas  Panagiotis Pintelas
Institution:1.Department of Computer Engineering & Informatics,Technological Educational Institute of Western Greece,Patras,Greece;2.Department of Mathematics,University of Patras,Patras,Greece
Abstract:In this work, we present a new hybrid conjugate gradient method based on a convex hybridization of the conjugate gradient update parameters of DY and HS+, adopting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that our proposed method is preferable to, and in general more efficient and robust than, classic conjugate gradient methods.
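The hybridization idea in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulas: it forms the convex combination of the HS+ and DY update parameters and picks the weight by minimizing the Euclidean distance to a given target direction, where `d_target` is a placeholder standing in for the self-scaling memoryless BFGS direction used in the paper.

```python
# Hedged sketch of convex hybridization of the HS+ and DY conjugate
# gradient parameters. The function and variable names are illustrative
# assumptions, not taken from the paper.
import numpy as np

def beta_dy(g, d_prev, y_prev):
    # Dai-Yuan parameter: ||g_k||^2 / (d_{k-1}^T y_{k-1})
    return (g @ g) / (d_prev @ y_prev)

def beta_hs_plus(g, d_prev, y_prev):
    # Nonnegative part of the Hestenes-Stiefel parameter
    return max(0.0, (g @ y_prev) / (d_prev @ y_prev))

def hybrid_direction(g, d_prev, y_prev, d_target):
    """Return (d, theta): the hybrid CG direction and its weight.

    d(theta) = -g + ((1 - theta)*beta_HS+ + theta*beta_DY) * d_prev
    is affine in theta, so ||d(theta) - d_target||^2 is a 1-D quadratic
    with a closed-form minimizer, which we clip to [0, 1] to keep the
    combination convex. Here d_target is a stand-in for the self-scaling
    memoryless BFGS direction.
    """
    b_hs = beta_hs_plus(g, d_prev, y_prev)
    b_dy = beta_dy(g, d_prev, y_prev)
    base = -g + b_hs * d_prev            # d(0)
    slope = (b_dy - b_hs) * d_prev       # d(theta) = base + theta * slope
    denom = slope @ slope
    theta = 0.0 if denom == 0.0 else float(slope @ (d_target - base) / denom)
    theta = min(1.0, max(0.0, theta))
    return base + theta * slope, theta
```

Note that the DY and HS denominators require `d_prev @ y_prev > 0`, which is guaranteed when the line search satisfies the Wolfe conditions, as assumed in the paper's convergence analysis.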
This article is indexed in SpringerLink and other databases.