Generalized pattern searches with derivative information
Authors: Mark A. Abramson, Charles Audet, J. E. Dennis Jr.
Institution: (1) Air Force Institute of Technology, 2950 Hobson Way, Building 640, Wright-Patterson AFB, Ohio 45433-7765; (2) École Polytechnique de Montréal and GERAD, Département de Mathématiques et de Génie Industriel, C.P. 6079, Succ. Centre-ville, Montréal (Québec) H3C 3A7, Canada; (3) Rice University, Department of Computational and Applied Mathematics, 8419 42nd Ave SW, Seattle, Washington 98136
Abstract: A common question asked by users of direct search algorithms is how to use derivative information at iterates where it is available. This paper addresses that question with respect to Generalized Pattern Search (GPS) methods for unconstrained and linearly constrained optimization. Specifically, this paper concentrates on the GPS poll step. Polling is done to certify the need to refine the current mesh, and it requires O(n) function evaluations in the worst case. We show that the use of derivative information significantly reduces the maximum number of function evaluations necessary for poll steps, even to a worst case of a single function evaluation with certain algorithmic choices given here. Furthermore, we show that rather rough approximations to the gradient are sufficient to reduce the poll step to a single function evaluation. We prove that using these less expensive poll steps does not weaken the known convergence properties of the method, all of which depend only on the poll step.
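To make the idea concrete, here is a minimal sketch of one GPS poll step with gradient-based pruning of poll directions. The function name `gps_poll`, its signature, and the simple "skip directions the gradient certifies as non-descent" rule are illustrative assumptions based on the abstract, not the paper's exact algorithm.

```python
import numpy as np

def gps_poll(f, x, delta, D, grad=None):
    """One GPS poll step: evaluate f at mesh neighbors x + delta*d, d in D.

    Illustrative sketch: if a gradient (even a rough approximation) is
    available, directions d with grad.d >= 0 are pruned, since the gradient
    already certifies they are not descent directions. This reduces the
    number of function evaluations the poll step needs.
    Returns (success, point): success=True with an improved mesh point,
    or success=False (poll failed, so the mesh would be refined).
    """
    fx = f(x)
    for d in D:
        # Prune directions the (approximate) gradient rules out.
        if grad is not None and np.dot(grad, d) >= 0:
            continue
        trial = x + delta * np.asarray(d, dtype=float)
        if f(trial) < fx:          # improved mesh point: successful poll
            return True, trial
    return False, x                # unsuccessful poll: refine the mesh

# Example on f(x) = ||x||^2 with the positive spanning set {±e_1, ±e_2}.
# With the exact gradient 2*x, only one of the four directions survives
# pruning, so the poll succeeds after a single trial evaluation.
f = lambda x: float(np.dot(x, x))
x0 = np.array([1.0, 0.0])
D = [np.array(v, dtype=float) for v in ([1, 0], [-1, 0], [0, 1], [0, -1])]
ok, x1 = gps_poll(f, x0, 0.5, D, grad=2 * x0)
```

In this run the directions e_1, e_2, and -e_2 are all pruned or non-improving, and -e_1 yields the improved point (0.5, 0) after one trial evaluation, matching the abstract's claim that derivative information can shrink the poll step toward a single function evaluation.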
Keywords:pattern search algorithm  linearly constrained optimization  surrogate-based optimization  nonsmooth optimization  gradient-based optimization
This article is indexed by SpringerLink and other databases.