TWO NOVEL GRADIENT METHODS WITH OPTIMAL STEP SIZES
Authors: Harry Oviedo, Oscar Dalmau, Rafael Herrera
Affiliation: Centro de Investigación en Matemáticas, CIMAT A.C., Guanajuato, Gto., Mexico
Funding: Supported in part by CONACYT (Mexico), Grants 258033 and 256126.
Abstract: In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method applied to strictly convex quadratic optimization problems. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information about the Hessian of the objective function. A convergence analysis of the proposed algorithm is provided. Some numerical experiments are performed in order to compare the efficiency and effectiveness of the proposed methods with similar methods in the literature. Experimentally, it is observed that our proposals accelerate the gradient method at almost no extra computational cost, which makes them a good alternative for solving large-scale problems.

Keywords: Gradient methods; Convex quadratic optimization; Hessian spectral properties; Steplength selection

Citation: Harry Oviedo, Oscar Dalmau, Rafael Herrera. TWO NOVEL GRADIENT METHODS WITH OPTIMAL STEP SIZES[J]. Journal of Computational Mathematics, 2021, 39(3): 375-391.
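
For context, the baseline that the abstract refers to is the gradient method with the classical Barzilai-Borwein (BB1) step size on a strictly convex quadratic f(x) = (1/2) x'Ax - b'x. The sketch below in Python illustrates that baseline only; the paper's two novel step sizes are defined in the full text and are not reproduced here, so the function name bb_gradient_method and all parameter defaults are illustrative assumptions.

    import numpy as np

    def bb_gradient_method(A, b, x0, tol=1e-8, max_iter=1000):
        # Gradient method with the classical Barzilai-Borwein (BB1) step size
        # for f(x) = 0.5 x'Ax - b'x, A symmetric positive definite.
        # Illustrative baseline only; NOT the paper's two novel step sizes.
        x = x0.astype(float)
        g = A @ x - b                                # gradient: grad f(x) = Ax - b
        alpha = 1.0 / max(np.linalg.norm(g), 1e-12)  # conservative first step
        for k in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            x_new = x - alpha * g                    # gradient step with current step size
            g_new = A @ x_new - b
            s = x_new - x                            # iterate difference
            y = g_new - g                            # gradient difference (y = A s for quadratics)
            alpha = (s @ s) / (s @ y)                # BB1 step: alpha_k = s's / s'y
            x, g = x_new, g_new
        return x, k

    # Usage on a random SPD test problem (hypothetical data, not from the paper):
    rng = np.random.default_rng(0)
    n = 100
    Q = rng.standard_normal((n, n))
    A = Q @ Q.T + n * np.eye(n)                      # symmetric positive definite
    b = rng.standard_normal(n)
    x_star, iters = bb_gradient_method(A, b, np.zeros(n))
    print(iters, np.linalg.norm(A @ x_star - b))     # residual should be near tol

Since y = A s holds exactly for quadratics, s'y = s'As > 0, so the BB1 step is always well defined and equals the inverse of the Rayleigh quotient s'As/s's; each step thus captures second-order (Hessian spectral) information at the cost of a single matrix-vector product, which is the property the keywords above allude to.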