Truncated Newton method for sparse unconstrained optimization using automatic differentiation
Authors: L. C. W. Dixon, R. C. Price
Affiliation: (1) School of Information Sciences, Hatfield Polytechnic, Hatfield, Hertfordshire, England; (2) Numerical Optimisation Centre, Hatfield Polytechnic, Hatfield, Hertfordshire, England
Abstract: When solving large, complex optimization problems, the user faces three major difficulties: (i) the cost in human time of obtaining accurate expressions for the derivatives involved; (ii) the need to store second-derivative information; and (iii), of lesser importance, the computer time taken to solve the problem. For many problems, a significant part of the latter can be attributed to solving Newton-like equations. In the algorithm described, these equations are solved by a conjugate direction method that requires the Hessian at the current point only through its product with a trial vector. In this paper, we present a method that computes this product by automatic differentiation while requiring only vector storage. The method takes advantage of any sparsity in the Hessian matrix and computes exact derivatives. It avoids the complexity of symbolic differentiation, the inaccuracy of numerical differentiation, the labor of finding analytic derivatives, and the need for matrix storage. When far from a minimum, an accurate solution of the Newton equations is not justified, so an approximate solution is obtained using a version of Dembo and Steihaug's truncated Newton algorithm (Ref. 1).

This paper was presented at the SIAM National Meeting, Boston, Massachusetts, 1986.
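The two ingredients described in the abstract, Hessian-vector products obtained by automatic differentiation without ever forming the Hessian, and an inner conjugate-gradient loop truncated in the spirit of Dembo and Steihaug, can be illustrated with modern AD software. The sketch below uses JAX; it is not the authors' implementation, and the test function, iteration caps, tolerances, and line-search constants are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of a truncated Newton
# iteration whose inner CG loop sees the Hessian only through
# Hessian-vector products computed by automatic differentiation.
import jax
import jax.numpy as jnp

def hvp(f, x, v):
    # Forward-over-reverse AD: computes H(x) @ v without forming H,
    # so storage stays O(n).
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

def tn_direction(f, x, g, max_cg=20):
    # Approximately solve H p = -g by conjugate gradients, stopping early
    # (truncating) once the residual is small relative to ||g||.
    p = jnp.zeros_like(x)
    r = -g                       # residual of H p = -g at p = 0
    d = r
    gnorm = jnp.linalg.norm(g)
    tol = jnp.minimum(0.5, jnp.sqrt(gnorm)) * gnorm  # inexact-Newton rule
    for _ in range(max_cg):
        Hd = hvp(f, x, d)
        curv = d @ Hd
        if curv <= 0.0:          # nonconvex curvature: keep current iterate
            break
        alpha = (r @ r) / curv
        p = p + alpha * d
        r_new = r - alpha * Hd
        if jnp.linalg.norm(r_new) <= tol:
            break
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    # Fall back to steepest descent if CG produced nothing usable.
    return p if bool(p.any()) else -g

def backtracking(f, x, p, g, shrink=0.5, c=1e-4):
    # Simple Armijo line search (an assumption; the paper does not specify one).
    t = 1.0
    while f(x + t * p) > f(x) + c * t * (g @ p) and t > 1e-10:
        t *= shrink
    return t

def rosenbrock(x):
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

x = jnp.array([-1.2, 1.0])
for _ in range(50):
    g = jax.grad(rosenbrock)(x)
    if jnp.linalg.norm(g) < 1e-8:
        break
    p = tn_direction(rosenbrock, x, g)
    x = x + backtracking(rosenbrock, x, p, g) * p
print(x)  # should be close to the minimizer (1.0, 1.0)
```

The forward-over-reverse composition `jax.jvp(jax.grad(f), ...)` evaluates H(x)v at roughly the cost of a few gradient evaluations and with vector-only storage, which is the property the paper exploits; the CG loop then needs the Hessian in no other form.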
Keywords: Optimization; truncated Newton method; automatic differentiation
Indexed by SpringerLink and other databases.