Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
Authors: Saman Babaie-Kafaki, Masoud Fatemi, Nezam Mahdavi-Amiri
Institution: (3) Center for Advanced Modeling and Optimization, Research Institute for Informatics, Bucharest, Romania; (4) Academy of Romanian Scientists, Bucharest, Romania
Abstract: Based on two modified secant equations proposed by Yuan and by Li and Fukushima, we extend the approach of Andrei and introduce two hybrid conjugate gradient methods for unconstrained optimization. Our methods are hybridizations of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods. Under proper conditions, we show that one of the proposed algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. To enhance the performance of the line search, we also propose a new approach for computing the initial steplength. We compare implementations of our algorithms with two representative, efficient hybrid conjugate gradient methods proposed by Andrei on unconstrained optimization test problems from the CUTEr collection. Numerical results show that, in the sense of the performance profile introduced by Dolan and Moré, the proposed hybrid algorithms are competitive and, in some cases, more efficient.
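For context, the hybridization referred to above is built from the standard Hestenes-Stiefel (HS) and Dai-Yuan (DY) conjugate gradient parameters. With gradients g_k, search directions d_{k+1} = -g_{k+1} + beta_k d_k, and y_k = g_{k+1} - g_k, these are, in standard notation,

\[
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^{2}}{d_k^{T} y_k}, \qquad
\beta_k = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY}, \quad \theta_k \in [0, 1],
\]

where the convex combination is the general form used in Andrei's hybrid methods; the abstract indicates that here theta_k is tied to the modified secant equations, whose exact formulas are not reproduced on this page.

The Python sketch below illustrates an iteration of this kind under stated assumptions: the fixed theta = 0.5, the Armijo backtracking line search, the unit initial steplength, and the steepest-descent restart are all placeholders standing in for the paper's adaptively chosen theta_k, Wolfe line search, proposed initial-steplength rule, and convergence safeguards.

import numpy as np

def hybrid_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Sketch of a hybrid HS/DY conjugate gradient iteration (not the
    paper's method: theta and the line search are placeholder choices)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search; the unit initial steplength is a
        # placeholder for the paper's proposed initial-steplength rule.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d.dot(y)
        if abs(dy) < 1e-12:
            d = -g_new  # safeguard: fall back to steepest descent
        else:
            beta_hs = g_new.dot(y) / dy        # Hestenes-Stiefel parameter
            beta_dy = g_new.dot(g_new) / dy    # Dai-Yuan parameter
            theta = 0.5  # placeholder; the paper determines theta_k adaptively
            d = -g_new + ((1 - theta) * beta_hs + theta * beta_dy) * d
            if g_new.dot(d) >= 0:              # restart if not a descent direction
                d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic, where the minimizer solves A x = b:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(hybrid_cg(f, grad, np.zeros(2)))  # approx. [0.6, -0.8]

The restart line is a standard safeguard for when the generated direction fails to be a descent direction; the paper instead rules this out through its choice of theta_k and the line search conditions.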
This document is indexed in SpringerLink and other databases.