A New Super-Memory Gradient Algorithm for Unconstrained Optimization
Cite this article: SHI Zhen-jun. A New Super-Memory Gradient Algorithm for Unconstrained Optimization[J]. Advances in Mathematics (China), 2006, 35(3): 265-274
Author: SHI Zhen-jun
Affiliation: College of Operations Research and Management, Qufu Normal University, Rizhao, Shandong 276826, P. R. China; Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing 100080, P. R. China
Funding: China Postdoctoral Science Foundation; K. C. Wong Education Foundation
Abstract: This paper proposes a new super-memory gradient algorithm for unconstrained optimization. The algorithm takes a linear combination of the negative gradient at the current point and the negative gradient at the previous point as the search direction, and determines the step size by exact line search or Armijo line search. Under very mild conditions, the algorithm is shown to be globally convergent with a linear convergence rate. Because it avoids storing and computing matrices associated with the objective function, the algorithm is well suited to large-scale unconstrained optimization problems. Numerical experiments show that the algorithm is more efficient than standard conjugate gradient algorithms.

Keywords: unconstrained optimization; super-memory gradient algorithm; global convergence; numerical experiment
Article ID: 1000-0917(2006)03-0265-10
Received: 2003-06-09
Revised: 2004-12-29

A New Super-memory Gradient Method for Unconstrained Optimization
SHI Zhen-jun. A New Super-memory Gradient Method for Unconstrained Optimization[J]. Advances in Mathematics (China), 2006, 35(3): 265-274
Authors:SHI Zhen-jun
Affiliation: College of Operations Research and Management, Qufu Normal University, Rizhao, Shandong 276826, P. R. China; Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, P.O. Box 2719, Beijing 100080, P. R. China
Abstract: A new super-memory gradient method for unconstrained optimization problems is proposed. The algorithm uses a linear combination of the current negative gradient and the previous negative gradient as the search direction, and uses exact or inexact line search to define the step size at each iteration. It is suitable for large-scale unconstrained optimization problems because it avoids the computation and storage of matrices associated with the Hessian of the objective function. Convergence of the algorithm under exact line search is proved, and global convergence is also established under the Armijo line search. Numerical experiments show that the algorithm is efficient in practical computation in many situations.
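The iteration described in the abstract can be sketched in a few lines. The sketch below is an illustration, not the paper's exact method: the mixing coefficient `mu`, the descent safeguard, and the Armijo parameters are assumed placeholders, since the paper's specific formulas for the combination coefficients and step-size rule are not reproduced here.

```python
import numpy as np

def armijo_step(f, grad_f, x, d, beta=0.5, sigma=1e-4, max_halvings=50):
    """Backtracking (Armijo) line search: return the largest t in
    {1, beta, beta^2, ...} with f(x + t d) <= f(x) + sigma * t * g(x)^T d."""
    fx, g = f(x), grad_f(x)
    t = 1.0
    for _ in range(max_halvings):
        if f(x + t * d) <= fx + sigma * t * g.dot(d):
            break
        t *= beta
    return t

def super_memory_gradient(f, grad_f, x0, mu=0.3, tol=1e-6, max_iter=5000):
    """Two-term (super-)memory gradient sketch: the search direction mixes
    the current negative gradient with the previous one,
        d_k = -g_k - mu * g_{k-1},
    where mu is a hypothetical fixed weight (the paper derives its own
    coefficients). No Hessian-related matrices are stored or computed."""
    x = np.asarray(x0, dtype=float)
    g_prev = None
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= tol:
            break
        d = -g if g_prev is None else -g - mu * g_prev
        if d.dot(g) >= 0.0:  # safeguard: fall back to steepest descent
            d = -g           # so d is always a descent direction
        x = x + armijo_step(f, grad_f, x, d) * d
        g_prev = g
    return x
```

Only gradient vectors are kept between iterations, which is why memory-gradient methods scale to large problems in the same way conjugate gradient methods do.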
Keywords: unconstrained optimization; memory gradient method; global convergence; numerical experiment
Indexed by: CNKI; VIP (维普); Wanfang Data