A new family of conjugate gradient methods for unconstrained optimization
Authors: Ming Li, Hongwei Liu, Zexian Liu
Institution: 1. College of Mathematics and Statistics, Xidian University, Xi'an, China; 2. School of Mathematics and Computer Science, Hezhou University, Hezhou, China
Abstract: A new family of conjugate gradient methods is proposed by minimizing the distance between two particular directions. It is a subfamily of the Dai–Liao family and contains the Hager–Zhang family and the Dai–Kou method. The search direction of the proposed method approximates that of the memoryless Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. For suitable intervals of the parameters, the direction satisfies the sufficient descent property independently of the line search. Under mild assumptions, we analyze the global convergence of the method for strongly convex functions and for general functions when the stepsize is obtained by the standard Wolfe line search. Numerical results indicate that the proposed method is promising and outperforms CGOPT and CG_DESCENT on a set of unconstrained optimization test problems.
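To make the setting concrete, the sketch below implements a generic Dai–Liao-type conjugate gradient iteration with a Wolfe line search, the family to which the proposed method belongs. The parameter `t` and the restart safeguards are illustrative placeholders, not the authors' specific parameter choice, and SciPy's `line_search` enforces the strong Wolfe conditions, which imply the standard Wolfe conditions used in the paper's analysis.

```python
import numpy as np
from scipy.optimize import line_search

def dai_liao_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao-type conjugate gradient method with Wolfe line search.

    Note: this is a sketch of the general Dai-Liao family; the paper's method
    is a particular subfamily, and `t` here is only a placeholder parameter.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Stepsize satisfying the (strong) Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dty = d @ y
        # Dai-Liao beta: (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k)
        beta = (g_new @ y - t * (g_new @ s)) / dty if abs(dty) > 1e-12 else 0.0
        d = -g_new + beta * d
        if d @ g_new >= 0:                   # safeguard: keep a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example usage on the Rosenbrock test function
    from scipy.optimize import rosen, rosen_der
    print(dai_liao_cg(rosen, rosen_der, np.zeros(5)))
```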
This article is indexed in SpringerLink and other databases.