On the projected subgradient method for nonsmooth convex optimization in a Hilbert space
Authors: Ya. I. Alber, A. N. Iusem, M. V. Solodov
Institution:(1) Instituto de Matemática Pura e Aplicada, Estrada Dona Castorina 110, CEP 22460-320 Jardim Botânico, Rio de Janeiro, RJ, Brazil;(2) Present address: Department of Mathematics, The Technion — Israel Institute of Technology, 32000 Haifa, Israel
Abstract: We consider the method for constrained convex optimization in a Hilbert space consisting of a step in the direction opposite to an ε_k-subgradient of the objective at the current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes α_k are exogenously given, satisfying Σ_{k=0}^∞ α_k = ∞, Σ_{k=0}^∞ α_k² < ∞, and ε_k is chosen so that ε_k ≤ μα_k for some μ > 0. We prove that the sequence generated in this way is weakly convergent to a minimizer if the problem has solutions, and is unbounded otherwise. Among the features of our convergence analysis, we mention that it covers the nonsmooth case, in the sense that we make no assumption of differentiability of f, much less of Lipschitz continuity of its gradient. Also, we prove weak convergence of the whole sequence, rather than just boundedness of the sequence and optimality of its weak accumulation points, thus improving over all previously known convergence results. We also present convergence rate results. © 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V.

Research of this author was partially supported by CNPq grant nos. 301280/86 and 300734/95-6.
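The iteration described in the abstract can be sketched in a few lines. The following is a minimal finite-dimensional illustration, not the authors' Hilbert-space analysis: it uses exact subgradients (ε_k = 0), omits the normalization of the direction, and the stepsize rule α_k = 1/(k+1), the objective f(x) = |x − 3|, and the feasible set C = [0, 2] are all assumptions chosen for the example.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, n_iters=5000):
    """Sketch of the projected subgradient iteration from the abstract:
    x_{k+1} = P_C(x_k - alpha_k * g_k), with g_k a subgradient of f at x_k.
    The exogenous stepsizes alpha_k = 1/(k+1) satisfy the abstract's
    conditions: sum alpha_k = infinity, sum alpha_k^2 < infinity."""
    x = x0
    for k in range(n_iters):
        alpha_k = 1.0 / (k + 1)           # divergent, square-summable series
        g = subgrad(x)                    # subgradient step ...
        x = project(x - alpha_k * g)      # ... followed by projection onto C
    return x

# Illustrative problem (assumed, not from the paper):
# minimize the nonsmooth f(x) = |x - 3| over C = [0, 2]; the minimizer is x* = 2.
subgrad = lambda x: np.sign(x - 3.0)      # a subgradient of |x - 3|
project = lambda x: np.clip(x, 0.0, 2.0)  # orthogonal projection onto [0, 2]
x_star = projected_subgradient(subgrad, project, x0=0.0)
```

Note that f is not differentiable at x = 3 and has no Lipschitz gradient, which is exactly the setting the paper's analysis covers; the projection keeps every iterate feasible while the diminishing stepsizes drive the sequence toward the constrained minimizer.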
Keywords: Convex optimization; Nonsmooth optimization; Projected gradient method; Steepest descent method; Weak convergence; Convergence rate
This article is indexed in SpringerLink and other databases.