A superlinearly convergent algorithm for minimization without evaluating derivatives
Authors: Robert Mifflin
Institution: (1) Yale University, New Haven, Conn., USA
Abstract: An algorithm for unconstrained minimization of a function of n variables that does not require the evaluation of partial derivatives is presented. It is a second-order extension of the method of local variations, and it does not require any exact one-variable minimizations. The method retains the local-variations property that accumulation points are stationary for a continuously differentiable function. Furthermore, because this extension makes the algorithm an approximate Newton method, its convergence is superlinear for a twice continuously differentiable strongly convex function.

Acknowledgment: Research sponsored by National Science Foundation Grant GK-32710 and by the Air Force Office of Scientific Research, Air Force Systems Command, USAF, under Grant No. AFOSR-74-2695.
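The abstract describes an approximate Newton method driven by function values alone. As a rough illustration of that general idea (not Mifflin's actual procedure, which builds on local-variation points rather than fresh difference stencils), the sketch below estimates the gradient and Hessian by finite differences and takes safeguarded Newton steps. All names, step sizes, and the test function here are illustrative assumptions, not taken from the paper.

# A minimal sketch (not Mifflin's exact algorithm) of a derivative-free
# approximate Newton iteration: gradient and Hessian are estimated from
# function values only, then a Newton step is taken with a simple
# decrease safeguard. Parameters h, tol, max_iter are assumed defaults.
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Central-difference gradient estimate using only f-evaluations."""
    n = len(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fd_hessian(f, x, h=1e-4):
    """Symmetric forward-difference Hessian estimate from f-evaluations."""
    n = len(x)
    H = np.zeros((n, n))
    fx = f(x)
    for i in range(n):
        ei = np.zeros(n)
        ei[i] = h
        for j in range(i, n):
            ej = np.zeros(n)
            ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei) - f(x + ej) + fx) / h**2
            H[j, i] = H[i, j]
    return H

def approx_newton(f, x0, tol=1e-8, max_iter=100):
    """Derivative-free approximate Newton method with step halving."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = fd_gradient(f, x)
        if np.linalg.norm(g) < tol:
            break
        H = fd_hessian(f, x)
        try:
            step = -np.linalg.solve(H, g)
        except np.linalg.LinAlgError:
            step = -g  # fall back to steepest descent if H is singular
        t = 1.0
        while f(x + t * step) > f(x) and t > 1e-12:
            t *= 0.5  # backtrack until the step actually decreases f
        x = x + t * step
    return x

if __name__ == "__main__":
    # Strongly convex quadratic (Hessian [[2, 1], [1, 4]] is positive definite):
    # an exact Newton step would reach the minimizer in one iteration.
    f = lambda x: (x[0] - 1.0)**2 + 2.0 * (x[1] + 0.5)**2 + x[0] * x[1]
    print(approx_newton(f, [5.0, 5.0]))

For the strongly convex quadratic in the example, a single exact Newton step reaches the minimizer, and the finite-difference errors perturb this only on the order of the step size h, which is consistent with the superlinear rate claimed in the abstract for twice continuously differentiable strongly convex functions.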
This article is indexed in SpringerLink and other databases.