An analysis of reduced Hessian methods for constrained optimization |
| |
Authors: | Richard H. Byrd (1); Jorge Nocedal (2) |
| |
Affiliation: | (1) Department of Computer Science, University of Colorado, Boulder, CO 80309, USA; (2) Department of Electrical Engineering and Computer Science, Northwestern University, Evanston, IL, USA |
| |
Abstract: | We study the convergence properties of reduced Hessian successive quadratic programming for equality constrained optimization. The method uses a backtracking line search and updates an approximation to the reduced Hessian of the Lagrangian by means of the BFGS formula. Two merit functions are considered for the line search: the ℓ1 function and Fletcher's exact penalty function. We give conditions under which local and superlinear convergence is obtained, and we also prove a global convergence result. The analysis allows the initial reduced Hessian approximation to be any positive definite matrix, and does not assume that the iterates converge or that the matrices are bounded. The effects of a second-order correction step, of a watchdog procedure, and of the choice of null-space basis are considered. This work can be seen as an extension to reduced Hessian methods of the well-known results of Powell (1976) for unconstrained optimization. |
| |
Funding: | The first author was supported, in part, by National Science Foundation grant CCR-8702403, Air Force Office of Scientific Research grant AFOSR-85-0251, and Army Research Office contract DAAL03-88-K-0086. The second author was supported by the Applied Mathematical Sciences subprogram of the Office of Energy Research, U.S. Department of Energy, under contracts W-31-109-Eng-38 and DE-FG02-87ER25047, and by National Science Foundation grant DCR-86-02071. |
| |
Keywords: | Constrained optimization; reduced Hessian methods; quasi-Newton methods; successive quadratic programming; nonlinear programming |
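The class of methods the abstract describes can be sketched in generic form. The following is a minimal illustration, not the authors' exact algorithm: it assumes an orthonormal null-space basis obtained from a QR factorization, a fixed ℓ1 penalty parameter `mu`, and a simplified sufficient-decrease test in the backtracking line search; all function and parameter names are hypothetical.

```python
import numpy as np

def solve_equality_sqp(f, grad_f, c, jac_c, x, mu=10.0, tol=1e-8, max_iter=100):
    """Sketch of a reduced Hessian SQP iteration for min f(x) s.t. c(x) = 0."""
    n, m = x.size, c(x).size
    B = np.eye(n - m)                       # reduced Hessian approximation
    for _ in range(max_iter):
        g, A = grad_f(x), jac_c(x)
        # Orthonormal range-space basis Y and null-space basis Z of the
        # constraint Jacobian A, from a complete QR factorization of A^T.
        Q, _ = np.linalg.qr(A.T, mode='complete')
        Y, Z = Q[:, :m], Q[:, m:]
        gz = Z.T @ g                        # reduced gradient
        if np.linalg.norm(gz) < tol and np.linalg.norm(c(x)) < tol:
            break
        p_y = -Y @ np.linalg.solve(A @ Y, c(x))   # feasibility step
        p_z = np.linalg.solve(B, -gz)             # optimality step
        p = p_y + Z @ p_z
        # Backtracking line search on the l1 merit function f(x) + mu*||c(x)||_1,
        # with a simplified sufficient-decrease test.
        phi = lambda z: f(z) + mu * np.sum(np.abs(c(z)))
        alpha, phi0 = 1.0, phi(x)
        while phi(x + alpha * p) > phi0 - 1e-4 * alpha * (gz @ gz) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * p
        # BFGS update of the reduced Hessian, in reduced coordinates.
        s = alpha * p_z
        y = Z.T @ grad_f(x_new) - gz
        if y @ s > 1e-10:                   # keep B positive definite
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
        x = x_new
    return x

# Tiny illustration: minimize x^2 + y^2 subject to x + y = 1.
x_star = solve_equality_sqp(
    f=lambda x: x @ x,
    grad_f=lambda x: 2 * x,
    c=lambda x: np.array([x[0] + x[1] - 1.0]),
    jac_c=lambda x: np.array([[1.0, 1.0]]),
    x=np.array([2.0, -1.0]),
)
# x_star is approximately [0.5, 0.5]
```

Working in the reduced coordinates keeps the BFGS matrix of size (n - m), which is the point of reduced Hessian methods when the number of constraints is large relative to the number of variables.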