

A New Gradient Method with an Optimal Stepsize Property
Authors:Y. H. Dai  X. Q. Yang
Affiliation:(1) State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, 100080, P.R. China;(2) Department of Applied Mathematics, The Hong Kong Polytechnic University, Kowloon, Hong Kong
Abstract: The gradient method for the symmetric positive definite linear system $$Ax=b$$ is as follows:

$$x_{k+1} = x_k - \alpha_k g_k, \qquad (1)$$

where $$g_k = Ax_k - b$$ is the residual of the system at $$x_k$$ and $$\alpha_k$$ is the stepsize. The stepsize $$\alpha_k = \frac{2}{\lambda_1 + \lambda_n}$$ is optimal in the sense that it minimizes the modulus $$\|I - \alpha A\|_2$$, where $$\lambda_1$$ and $$\lambda_n$$ are the minimal and maximal eigenvalues of A, respectively. Since $$\lambda_1$$ and $$\lambda_n$$ are unknown to users, the gradient method with the optimal stepsize is usually mentioned only in theory. In this paper, we propose a new stepsize formula that tends to the optimal stepsize as $$k \to \infty$$. At the same time, the minimal and maximal eigenvalues $$\lambda_1$$ and $$\lambda_n$$ of A and their corresponding eigenvectors can be obtained.

This research was initiated while the first author was visiting The Hong Kong Polytechnic University. The first author was supported by the Chinese NSF grants (No. 40233029 and 101071104) and an innovation fund of the Chinese Academy of Sciences. The second author was supported by a grant from the Research Committee of The Hong Kong Polytechnic University (A-PC36).
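The paper's new stepsize formula is not reproduced in this abstract. For reference only, the following is a minimal Python sketch of iteration (1) using the idealized optimal stepsize $$\frac{2}{\lambda_1+\lambda_n}$$, with the extreme eigenvalues computed directly (which is exactly what the paper's method avoids having to do); the function name, tolerance, and test problem are illustrative assumptions, not part of the paper.

    import numpy as np

    def gradient_method_optimal_step(A, b, x0, tol=1e-10, max_iter=1000):
        # Iteration (1) for a symmetric positive definite system Ax = b with the
        # fixed "optimal" stepsize alpha = 2 / (lambda_1 + lambda_n).
        # Here the eigenvalues are computed explicitly for illustration only.
        eigvals = np.linalg.eigvalsh(A)            # eigenvalues of symmetric A, ascending
        alpha = 2.0 / (eigvals[0] + eigvals[-1])   # 2 / (lambda_1 + lambda_n)
        x = x0.astype(float).copy()
        for k in range(max_iter):
            g = A @ x - b                          # residual g_k = A x_k - b
            if np.linalg.norm(g) < tol:
                break
            x = x - alpha * g                      # x_{k+1} = x_k - alpha g_k
        return x

    # Example on a random SPD system (hypothetical test data)
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T + 5 * np.eye(5)
    b = rng.standard_normal(5)
    x = gradient_method_optimal_step(A, b, np.zeros(5))
    print(np.linalg.norm(A @ x - b))

With this stepsize the iteration matrix $$I - \alpha A$$ has spectral radius $$\frac{\lambda_n - \lambda_1}{\lambda_n + \lambda_1} < 1$$, which is why the fixed stepsize is optimal among constant stepsizes; the contribution of the paper is a computable stepsize that approaches this value without prior knowledge of $$\lambda_1$$ and $$\lambda_n$$.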
Keywords: linear system; gradient method; steepest descent method; (shifted) power method