Another hybrid conjugate gradient algorithm for unconstrained optimization
Author: Neculai Andrei
Institution: Research Institute for Informatics, Center for Advanced Modeling and Optimization, 8-10, Averescu Avenue, Bucharest 1, Romania
Abstract: Another hybrid conjugate gradient algorithm is subject to analysis. The parameter $\beta_k$ is computed as a convex combination of $\beta_k^{HS}$ (Hestenes-Stiefel) and $\beta_k^{DY}$ (Dai-Yuan), i.e. $\beta_k^C = (1-\theta_k)\beta_k^{HS} + \theta_k\beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed so that the direction of the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the quasi-Newton equation $\nabla^2 f(x_{k+1})\, s_k = y_k$, where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons show that this hybrid computational scheme outperforms the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms, as well as the hybrid conjugate gradient algorithms of Dai and Yuan, on a set of 750 unconstrained optimization problems, some of them from the CUTE library.
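To see where $\theta_k$ comes from, equate the conjugate gradient direction $d_{k+1} = -g_{k+1} + \beta_k^C s_k$ with the Newton direction $-\nabla^2 f(x_{k+1})^{-1} g_{k+1}$, multiply through by $s_k^T \nabla^2 f(x_{k+1})$, and replace $\nabla^2 f(x_{k+1})\, s_k$ by $y_k$ via the quasi-Newton equation; under this convention (an assumption consistent with the abstract, not a formula stated in it) solving for $\theta_k$ gives $\theta_k = \frac{g_{k+1}^T s_k}{g_{k+1}^T y_k - g_{k+1}^T g_{k+1}}$.

The sketch below illustrates such a hybrid scheme in Python. It is a minimal reading of the abstract, not the author's code: the function name `hybrid_cg`, the restart and fallback tolerances, and the clipping of $\theta_k$ to $[0,1]$ (to keep the combination convex) are assumptions; the Wolfe line search is delegated to SciPy's `line_search`.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Hybrid HS/DY conjugate gradient sketch with a convex-combination beta."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Line search satisfying the (strong) Wolfe conditions.
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:          # line search failed; take a small fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x              # s_k = x_{k+1} - x_k
        y = g_new - g              # y_k = g_{k+1} - g_k
        ys = y @ s
        if abs(ys) < 1e-12:        # degenerate curvature information; restart
            d = -g_new
        else:
            beta_hs = (g_new @ y) / ys        # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / ys    # Dai-Yuan
            # theta_k from the Newton-direction condition derived above
            # (assumed convention d_{k+1} = -g_{k+1} + beta_k * s_k).
            denom = g_new @ y - g_new @ g_new
            theta = (g_new @ s) / denom if abs(denom) > 1e-12 else 0.0
            theta = min(max(theta, 0.0), 1.0)  # clip so the combination stays convex
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
            d = -g_new + beta * s
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Illustrative usage on the Rosenbrock function from SciPy.
    from scipy.optimize import rosen, rosen_der
    print(hybrid_cg(rosen, rosen_der, np.zeros(5)))  # expected: close to all ones
```

Clipping $\theta_k$ means the scheme falls back to pure Hestenes-Stiefel when $\theta_k < 0$ and to pure Dai-Yuan when $\theta_k > 1$, so the update always remains a genuine convex combination of the two formulas.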
Keywords: Unconstrained optimization; Hybrid conjugate gradient method; Newton direction; Numerical comparisons
This document is indexed in SpringerLink and other databases.