An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
Authors:Zexian Liu  Hongwei Liu
Affiliation:1.School of Mathematics and Statistics,Xidian University,Xi’an,People’s Republic of China;2.School of Mathematics and Computer Science,Hezhou University,Hezhou,People’s Republic of China
Abstract: In this paper, we introduce a new concept of approximate optimal stepsize for the gradient method, use it to interpret the Barzilai-Borwein (BB) method, and present an efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. If the objective function f is not close to a quadratic on the line segment between the current iterate x_k and the previous iterate x_{k-1}, we construct a conic model to generate the approximate optimal stepsize, provided the conic model is suitable; otherwise, we construct a new quadratic model or one of two other new approximation models to generate the approximate optimal stepsize. We analyze the convergence of the proposed method under suitable conditions. Numerical results show that the proposed method is very promising.
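As background for the abstract, a minimal sketch of the Barzilai-Borwein (BB) gradient method that the paper reinterprets: the BB1 stepsize alpha_k = (s^T s)/(s^T y), with s = x_k - x_{k-1} and y = g_k - g_{k-1}, approximates the optimal stepsize along the negative gradient. This is a generic illustration of the BB method, not the authors' proposed algorithm; the function names and the quadratic test problem are assumptions for the example.

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=500, tol=1e-8, alpha0=1e-3):
    """Gradient method with the BB1 stepsize (illustrative sketch).

    grad: callable returning the gradient of the objective at x.
    alpha0: fallback stepsize used on the first iteration and
            whenever the BB curvature s^T y is not safely positive.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g          # gradient step
        g_new = grad(x_new)
        s = x_new - x                  # iterate difference
        y = g_new - g                  # gradient difference
        sy = s @ y
        # BB1 stepsize: (s^T s) / (s^T y), guarded against tiny curvature
        alpha = (s @ s) / sy if sy > 1e-16 else alpha0
        x, g = x_new, g_new
    return x

# Example: minimize the strictly convex quadratic f(x) = 0.5 x^T A x,
# whose gradient is A x and whose unique minimizer is the origin.
A = np.diag(np.arange(1.0, 6.0))
x_star = bb_gradient(lambda x: A @ x, np.ones(5))
```

For strictly convex quadratics the BB method is known to converge R-superlinearly in two dimensions and R-linearly in general; the paper's contribution is to generalize the stepsize via conic and quadratic approximation models when f is far from quadratic between consecutive iterates.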
This article is indexed in SpringerLink and other databases.