Minimizing a differentiable function over a differential manifold
Authors: D. Gabay
Institution: (1) Laboratoire d'Analyse Numérique, Université P. et M. Curie, Paris, France; (2) INRIA, Domaine de Voluceau, Le Chesnay, France
Abstract: To generalize the descent methods of unconstrained optimization to the constrained case, we define intrinsically the gradient field of the objective function on the constraint manifold and analyze descent methods along geodesics, including the gradient projection and reduced gradient methods for special choices of coordinate systems. In particular, we generalize the quasi-Newton methods and establish their superlinear convergence; we show that they require updating only a reduced-size matrix. In practice, the geodesic search is approximated by a tangent step followed by a constraint restoration, or by a simple arc search again followed by a restoration step.

A first draft of this paper was presented at the 10th International Symposium on Mathematical Programming, Montreal, Canada, 1979. The present version was prepared while the author was visiting the Department of Engineering-Economic Systems at Stanford University, partially supported by AFOSR Grant No. 77-3141.
Keywords: Nonlinearly constrained optimization; differential geometry; Riemannian manifolds; iterative methods; convergence theorems; rate of convergence
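To make the "tangent step followed by a restoration" idea from the abstract concrete, here is a minimal numerical sketch. It is not the author's algorithm: it assumes the unit sphere as the constraint manifold, the Euclidean metric, and renormalization as the restoration step, and all names (riemannian_gradient_descent, grad_f, lr) are illustrative choices.

import numpy as np

def riemannian_gradient_descent(grad_f, x0, steps=200, lr=0.1):
    # Minimize f over the unit sphere {x : ||x|| = 1}.
    x = x0 / np.linalg.norm(x0)       # start on the manifold
    for _ in range(steps):
        g = grad_f(x)                 # Euclidean gradient of f at x
        g_tan = g - (g @ x) * x       # project onto the tangent space:
                                      # the intrinsic gradient on the manifold
        x = x - lr * g_tan            # tangent step (leaves the sphere)
        x = x / np.linalg.norm(x)     # restoration: retract back to the sphere
    return x

# Worked example: minimizing f(x) = x'Ax over the unit sphere drives x
# toward an eigenvector for the smallest eigenvalue of A.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
grad_f = lambda x: 2.0 * A @ x
x_star = riemannian_gradient_descent(grad_f, np.array([1.0, 0.3]))
print(x_star, x_star @ A @ x_star)    # x'Ax approaches the smallest eigenvalue

The normalization plays the role of the restoration step here only because the sphere makes it cheap; on a general constraint manifold {x : c(x) = 0}, restoring feasibility would instead require an iterative step such as a Newton correction, which is the setting the paper addresses.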