

Using parallel function evaluations to improve hessian approximation for unconstrained optimization
Authors:Richard H. Byrd  Robert B. Schnabel  Gerald A. Shultz
Affiliation:(1) Department of Computer Science, University of Colorado, Boulder, CO 80309, USA; (2) Department of Mathematical Sciences, Metropolitan State College, Denver, CO 80204, USA
Abstract:This paper presents a new class of methods for solving unconstrained optimization problems on parallel computers. The methods are intended for small- to moderate-dimensional problems in which function and derivative evaluation is the dominant cost. They use multiple processors to evaluate the function, the (finite difference) gradient, and a portion of the finite difference Hessian simultaneously at each iterate. We introduce three types of new methods, all of which use the new finite difference Hessian information in forming the Hessian approximation at each iteration; they differ in whether, and how, they also use the standard secant information from the current step. We present theoretical analyses of the rate of convergence of several of these methods, together with computational results illustrating their performance on parallel computers when function evaluation is expensive.

Research supported by AFOSR grant AFOSR-85-0251, ARO contract DAAG 29-84-K-0140, NSF grant DCR-8403483, and NSF cooperative agreement DCR-8420944.
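The parallelism the abstract describes rests on a simple observation: the n perturbed function values needed for a forward-difference gradient, and the n(n+1)/2 values needed for a finite difference Hessian, are mutually independent, so they can all be evaluated simultaneously. The following is an illustrative sketch of that idea only, not the authors' methods; the Rosenbrock objective, the step sizes, and the thread-based worker pool are assumptions made for demonstration.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def f(x):
    # Example objective (Rosenbrock); a stand-in for an expensive function.
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def fd_gradient_parallel(func, x, h=1e-6):
    """Forward-difference gradient: the n evaluations func(x + h*e_i)
    are independent, so they can all run concurrently."""
    n = len(x)
    e = np.eye(n)
    with ThreadPoolExecutor() as ex:
        fvals = list(ex.map(func, (x + h * e[i] for i in range(n))))
    fx = func(x)
    return np.array([(fv - fx) / h for fv in fvals])

def fd_hessian_parallel(func, x, h=1e-5):
    """Forward-difference Hessian:
    H[i,j] ~ (f(x + h e_i + h e_j) - f(x + h e_i) - f(x + h e_j) + f(x)) / h^2.
    Every required function value is independent of the others, so all
    n(n+1)/2 pairwise evaluations can be farmed out at once."""
    n = len(x)
    e = np.eye(n)
    fx = func(x)
    pairs = [(i, j) for i in range(n) for j in range(i, n)]
    with ThreadPoolExecutor() as ex:
        f_i = list(ex.map(func, (x + h * e[i] for i in range(n))))
        f_ij = list(ex.map(lambda p: func(x + h * (e[p[0]] + e[p[1]])), pairs))
    H = np.zeros((n, n))
    for (i, j), fij in zip(pairs, f_ij):
        H[i, j] = H[j, i] = (fij - f_i[i] - f_i[j] + fx) / h ** 2
    return H
```

For a pure-Python objective the GIL limits the speedup of a thread pool; when function evaluation is truly expensive (e.g. an external simulation), a process pool or MPI-style distribution over real processors, as in the paper's setting, would be the natural substitute.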
This article is indexed by SpringerLink and other databases.