A Super-memory Gradient Algorithm for Unconstrained Optimization and Its Convergence Properties
Cite this article: Tang Jingyong, Dong Li. A Super-memory Gradient Algorithm for Unconstrained Optimization and Its Convergence Properties [J]. Mathematical Theory and Applications, 2008, 28(4): 1-5.
Authors: Tang Jingyong, Dong Li
Institution: College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000
Abstract: This paper studies a new super-memory gradient algorithm for unconstrained optimization. At each iteration the algorithm makes full use of information from previous iterates to generate a descent direction, and uses the Wolfe line search to determine the step size. Global convergence is proved under fairly weak conditions. Since the new algorithm requires no matrix computation or storage at any iteration, it is well suited to large-scale optimization problems.

Keywords: unconstrained optimization; super-memory gradient method; Wolfe line search; global convergence

Convergence of Super-memory Gradient Method for Unconstrained Optimization
Tang Jingyong, Dong Li. Convergence of Super-memory Gradient Method for Unconstrained Optimization [J]. Mathematical Theory and Applications, 2008, 28(4): 1-5.
Authors: Tang Jingyong, Dong Li
Institution: College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000
Abstract: The paper presents a new super-memory gradient method for unconstrained optimization problems. This method makes use of the current and previous multi-step iterative information to generate a new search direction, and uses the Wolfe line search to define the step size at each iteration. Global convergence is proved under some mild conditions. The method is suitable for solving large-scale unconstrained optimization problems because it avoids the computation and storage of matrices.
Keywords: unconstrained optimization; super-memory gradient method; Wolfe line search; global convergence
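The abstract describes a super-memory gradient iteration: a descent direction built from the current gradient and a memory of recent search directions, with the step size chosen by a Wolfe line search. The sketch below is a generic illustration of that scheme, not the paper's exact method: the direction formula (negative gradient plus a fixed-weight combination of the last m directions), the safeguard falling back to steepest descent, and all parameter values are assumptions for demonstration.

```python
# Illustrative sketch of a super-memory gradient method with Wolfe line
# search. Direction formula and parameters are assumptions, not the
# paper's exact scheme. Pure Python, vectors as plain lists.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def wolfe_step(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step satisfying the Wolfe conditions."""
    fx, slope = f(x), dot(grad(x), d)   # slope < 0 for a descent direction
    alpha, lo, hi = 1.0, 0.0, None
    for _ in range(max_iter):
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        if f(x_new) > fx + c1 * alpha * slope:        # Armijo fails: shrink
            hi = alpha
        elif dot(grad(x_new), d) < c2 * slope:        # curvature fails: grow
            lo = alpha
        else:
            return alpha
        alpha = (lo + hi) / 2 if hi is not None else 2 * alpha
    return alpha

def super_memory_gradient(f, grad, x0, m=3, beta=0.1, tol=1e-6, max_iter=500):
    """Minimize f, reusing the last m search directions (memory weight beta)."""
    x, history = list(x0), []
    for _ in range(max_iter):
        g = grad(x)
        if dot(g, g) ** 0.5 < tol:
            break
        d = [-gi for gi in g]
        for d_old in history:                          # add memory terms
            d = [di + beta * doi for di, doi in zip(d, d_old)]
        if dot(g, d) >= 0:                             # safeguard: keep descent
            d = [-gi for gi in g]
        alpha = wolfe_step(f, grad, x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        history = (history + [d])[-m:]                 # keep last m directions
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10 (y + 2)^2, minimum at (1, -2).
f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
grad = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
sol = super_memory_gradient(f, grad, [0.0, 0.0])
```

Note that, as the abstract emphasizes, the iteration touches only vectors (gradients and a short list of past directions), so memory cost is O(m·n) with no matrix storage.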
Indexed by CNKI, VIP, and other databases.