Convergence of Gradient Algorithms for Nonconvex C1+α Cost Functions*
Author: Zixuan WANG
Abstract: This paper is concerned with the convergence of stochastic gradient algorithms with momentum terms in the nonconvex setting. A class of stochastic momentum methods, including stochastic gradient descent, the heavy ball method, and Nesterov's accelerated gradient, is analyzed in a general framework under mild assumptions. Building on the convergence result for expected gradients, the authors prove almost sure convergence through a detailed analysis of the effects of momentum and the number of upcrossings. Notably, no additional restrictions are imposed on the objective function or the stepsize. A further improvement over previous results is that the usual Lipschitz condition on the gradient is relaxed to Hölder continuity. As a byproduct, the authors apply a localization procedure to extend the results to stochastic stepsizes.
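The stochastic momentum methods named in the abstract can be sketched in a single unified iteration; the sketch below is an illustrative assumption about the common form (heavy ball with a noisy gradient, reducing to plain SGD when the momentum coefficient is zero), not the paper's exact framework, and the toy objective, stepsize, and noise level are likewise hypothetical.

```python
import numpy as np

def heavy_ball_sgd(grad, x0, alpha=0.01, beta=0.9, noise=0.01, steps=2000, seed=0):
    """Stochastic heavy-ball iteration:
        x_{k+1} = x_k - alpha * g_k + beta * (x_k - x_{k-1}),
    where g_k is a noisy gradient estimate at x_k.
    Setting beta = 0 recovers plain stochastic gradient descent.
    """
    rng = np.random.default_rng(seed)
    x = x_prev = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + noise * rng.standard_normal(x.shape)  # stochastic gradient
        x, x_prev = x - alpha * g + beta * (x - x_prev), x   # momentum update
    return x

# Toy nonconvex objective f(x) = x^4 - 2x^2, with minima at x = +1 and x = -1;
# its gradient is C^1 but illustrates the nonconvex setting the paper studies.
grad = lambda x: 4 * x**3 - 4 * x
x_star = heavy_ball_sgd(grad, x0=[0.5])
```

Nesterov's accelerated gradient differs only in evaluating the gradient at the extrapolated point `x + beta * (x - x_prev)` rather than at `x`.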

Received: 2021-06-04
Revised: 2022-01-07
Journal: Chinese Annals of Mathematics, Series B (《数学年刊B辑(英文版)》)