Global and Superlinear Convergence of a Restricted Class of Self-Scaling Methods with Inexact Line Searches, for Convex Functions
Authors: M. Al-Baali
Institution: (1) Department of Mathematics and Statistics, Sultan Qaboos University, Sultanate of Oman
Abstract: This paper studies the convergence properties of algorithms belonging to the class of self-scaling (SS) quasi-Newton methods for unconstrained optimization. This class depends on two parameters, say θ_k and τ_k, for which the choice τ_k = 1 gives the Broyden family of unscaled methods, where θ_k = 1 corresponds to the well-known DFP method. We propose simple conditions on these parameters that give rise to global convergence with inexact line searches, for convex objective functions. Q-superlinear convergence is achieved if further restrictions on the scaling parameter are introduced. These convergence results extend the known results for the unscaled methods. Because the scaling parameter is heavily restricted, we consider a subclass of SS methods which satisfies the required conditions. Although convergence for the unscaled methods with θ_k ≥ 1 is still an open question, we show that global and superlinear convergence for SS methods is possible and, in particular, present a new SS-DFP method.
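For orientation, a minimal sketch of the two-parameter self-scaling update in its commonly used form; the notation B_k (Hessian approximation), s_k = x_{k+1} − x_k, and y_k = g_{k+1} − g_k is assumed here rather than quoted from the paper:

\[
B_{k+1} = \tau_k \left( B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
          + \theta_k \, w_k w_k^{\top} \right) + \frac{y_k y_k^{\top}}{s_k^{\top} y_k},
\qquad
w_k = \left( s_k^{\top} B_k s_k \right)^{1/2}
      \left( \frac{y_k}{s_k^{\top} y_k} - \frac{B_k s_k}{s_k^{\top} B_k s_k} \right).
\]

With τ_k = 1 this reduces to the unscaled Broyden family, and θ_k = 1 then yields the DFP update, consistent with the description in the abstract.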
Keywords: Quasi-Newton methods; Broyden's class; self-scaling; inexact line searches; global and superlinear convergence
This article is indexed by SpringerLink and other databases.