Abstract: This paper is concerned with the convergence of stochastic gradient algorithms with momentum terms in the nonconvex setting. A class of stochastic momentum methods, including stochastic gradient descent, the heavy ball method, and Nesterov's accelerated gradient, is analyzed in a general framework under mild assumptions. Building on the convergence result for expected gradients, the authors prove almost sure convergence through a detailed discussion of the effects of momentum and the number of upcrossings. It is worth noting that no additional restrictions are imposed on the objective function or the stepsize. Another improvement over previous results is that the usual Lipschitz condition on the gradient is relaxed to Hölder continuity. As a byproduct, the authors apply a localization procedure to extend the results to stochastic stepsizes.