Study on a memory gradient method for the minimization of functions
Authors: A. Miele, J. W. Cantrell
Institution: (1) Department of Mechanical and Aerospace Engineering and Materials Science, Rice University, Houston, Texas; (2) Present address: LTV Aerospace Corporation, Dallas, Texas
Abstract: A new accelerated gradient method for finding the minimum of a function $f(x)$ whose variables are unconstrained is investigated. The new algorithm can be stated as follows:
$$\tilde x = x + \delta x,\qquad \delta x = -\alpha g(x) + \beta \delta \hat x$$
where $\delta x$ is the change in the position vector $x$, $g(x)$ is the gradient of the function $f(x)$, and $\alpha$ and $\beta$ are scalars chosen at each step so as to yield the greatest decrease in the function. The symbol $\delta \hat x$ denotes the change in the position vector for the iteration preceding that under consideration.

For a nonquadratic function, initial convergence of the present method is faster than that of the Fletcher-Reeves method because of the extra degree of freedom available. For a test problem, the number of iterations was about 40–50% of that required by the Fletcher-Reeves method, and the computing time about 60–75%, using comparable search techniques.

This research, supported by the Office of Scientific Research, Office of Aerospace Research, United States Air Force, Grant No. AF-AFOSR-828-67, is a condensed version of the investigation described in Ref. 1. Portions of this paper were presented by the senior author at the International Symposium on Optimization Methods, Nice, France, 1969.
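The two-parameter update above can be sketched in code. The following is a minimal illustration, not the paper's method: it replaces the authors' two-parameter search with a crude grid search over $(\alpha, \beta)$, and tests on an arbitrary quadratic chosen here for illustration.

```python
import numpy as np

def memory_gradient(f, grad, x0, iters=50,
                    alphas=np.linspace(0.0, 1.0, 41),
                    betas=np.linspace(-1.0, 1.0, 41)):
    """Minimize f by the memory-gradient update
        x_new = x + dx,   dx = -alpha * g(x) + beta * dx_prev,
    choosing (alpha, beta) at each step by a coarse grid search
    (a stand-in for a proper two-parameter line search).
    """
    x = np.asarray(x0, dtype=float)
    dx_prev = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        # pick the (alpha, beta) pair giving the greatest decrease in f;
        # (0, 0) is implicitly allowed, so f never increases
        best_f, best_dx = f(x), np.zeros_like(x)
        for a in alphas:
            for b in betas:
                dx = -a * g + b * dx_prev
                fx = f(x + dx)
                if fx < best_f:
                    best_f, best_dx = fx, dx
        x = x + best_dx
        dx_prev = best_dx
    return x

# illustrative quadratic f(x) = 0.5 x'Ax + b'x, minimizer at (0.6, -2.8)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 5.0])
f = lambda x: 0.5 * x @ A @ x + b @ x
grad = lambda x: A @ x + b
xmin = memory_gradient(f, grad, np.zeros(2))
```

Because the step $-\alpha g(x)$ shrinks with the gradient, the grid search still refines the iterate near the minimizer; on a quadratic the memory term plays the same role as the conjugate direction in the Fletcher-Reeves method.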
This document is indexed by SpringerLink and other databases.