Globally Convergent Inexact Generalized Newton Method for First-Order Differentiable Optimization Problems
| |
Authors: | Pu, D.; Zhang, J.
| |
Affiliation: | (1) Institute of Mathematics, Shanghai Tiedao University, Shanghai, China; (2) Department of Mathematics, City University of Hong Kong, Hong Kong, China
| |
Abstract: | Motivated by the method of Martinez and Qi (Ref. 1), we propose in this paper a globally convergent inexact generalized Newton method for unconstrained optimization problems whose objective functions have Lipschitz continuous gradients but are not twice differentiable. The method is implementable, globally convergent, and produces monotonically decreasing function values. We prove that it has a locally superlinear, or even quadratic, convergence rate under mild conditions that do not assume convexity of the objective function.
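The kind of iteration the abstract describes — a Newton-type step built from an element of the generalized Jacobian of the gradient, with the linear system solved only inexactly and a line search enforcing monotone decrease — can be sketched as follows. This is an illustrative sketch on a simple convex test problem, not the authors' algorithm: the test function, the forcing term `eta`, the Armijo constant, and all helper names (`f`, `grad`, `cg`) are assumptions made for the example.

```python
import numpy as np

# Illustrative test problem: f(x) = 0.5 x'Ax + 0.5 ||max(x, 0)||^2.
# Its gradient g(x) = Ax + max(x, 0) is Lipschitz continuous but not
# differentiable wherever some component x_i = 0, so f is once but not
# twice differentiable -- the setting the abstract describes.
def f(x, A):
    xp = np.maximum(x, 0.0)
    return 0.5 * x @ (A @ x) + 0.5 * xp @ xp

def grad(x, A):
    return A @ x + np.maximum(x, 0.0)

def cg(B, rhs, tol):
    """Conjugate gradients, stopped once ||B d - rhs|| <= tol (the inexactness)."""
    d = np.zeros_like(rhs)
    r = rhs.copy()          # residual rhs - B d (d = 0 initially)
    p = r.copy()
    rs = r @ r
    while np.sqrt(rs) > tol:
        Bp = B @ p
        alpha = rs / (p @ Bp)
        d += alpha * p
        r -= alpha * Bp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def inexact_generalized_newton(x, A, eta=0.5, tol=1e-10, max_iter=100):
    for _ in range(max_iter):
        g = grad(x, A)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        # One element of the generalized Jacobian of g at x (the choice at
        # points with x_i = 0 is arbitrary within [0, 1]; we pick 0).
        B = A + np.diag((x > 0.0).astype(float))
        # Inexact Newton step: solve B d = -g only to residual eta * ||g||.
        d = cg(B, -g, eta * gnorm)
        # Armijo backtracking keeps the function values monotonically decreasing.
        t, fx = 1.0, f(x, A)
        while f(x + t * d, A) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x

# 5x5 SPD tridiagonal A; the unique minimizer of f here is x* = 0.
A = 2.0 * np.eye(5) - np.eye(5, k=1) - np.eye(5, k=-1)
x_sol = inexact_generalized_newton(np.array([3.0, -1.0, 4.0, -1.0, 5.0]), A)
```

Because the CG iterate minimizes the quadratic model over a Krylov subspace, the computed `d` is guaranteed to be a descent direction even though the system is solved inexactly, which is what makes the Armijo backtracking loop terminate.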
| |
Keywords: | nonsmooth optimization; inexact Newton methods; generalized Newton methods; global convergence; superlinear rate
This article is indexed in SpringerLink and other databases.