Instituto de Matemática Pura e Aplicada, Estrada Dona Castorina 110, Jardim Botânico, Rio de Janeiro, RJ, CEP 22460-320, Brazil
Abstract:
We consider the class of descent algorithms for unconstrained optimization with an Armijo-type stepsize rule in the case where the gradient of the objective function is computed inexactly. An important novel feature of our theoretical analysis is that the perturbations associated with the gradient are not assumed to be relatively small or to tend to zero in the limit (as a practical matter, we expect them to be reasonably small, so that a meaningful approximate solution can be obtained). This feature makes our analysis applicable to a variety of difficult problems encountered in practice. We propose a modified Armijo-type rule for computing the stepsize which guarantees that the algorithm obtains a reasonable approximate solution. Furthermore, if the perturbations are small relative to the size of the gradient, then our algorithm retains all the standard convergence properties of descent methods.
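To fix ideas, the sketch below shows the classical Armijo backtracking descent step driven by an inexact gradient oracle, which is the setting the abstract describes. It is illustrative only: it does not reproduce the modified stepsize rule proposed in the paper, and the function names and parameter values (sigma, beta, the backtracking cap) are assumptions chosen for the example.

```python
import numpy as np

def armijo_descent(f, grad_approx, x0, sigma=1e-4, beta=0.5, s=1.0,
                   tol=1e-6, max_iter=200, max_backtracks=30):
    """Steepest descent with the classical Armijo backtracking rule,
    using an inexact gradient oracle `grad_approx`.

    Illustrative sketch of the setting only; all parameters are
    assumptions, and the paper's modified rule is not shown here.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_approx(x)            # g = grad f(x) + perturbation
        if np.linalg.norm(g) < tol:   # approximate stationarity
            break
        d = -g                        # direction from the inexact gradient
        t = s
        for _ in range(max_backtracks):
            # Armijo sufficient-decrease test, with g in place of the
            # true gradient:  f(x + t*d) <= f(x) + sigma * t * <g, d>
            if f(x + t * d) <= f(x) + sigma * t * g.dot(d):
                break
            t *= beta
        else:
            # With a large perturbation, -g need not be a descent
            # direction, so the test can fail for every t; this failure
            # mode is what motivates a modified rule.
            break
        x = x + t * d
    return x

# Example: a quadratic with bounded, non-vanishing gradient noise.
rng = np.random.default_rng(0)
f = lambda x: 0.5 * x.dot(x)
grad_approx = lambda x: x + 1e-3 * rng.standard_normal(x.shape)
x_approx = armijo_descent(f, grad_approx, np.ones(5))
```

Note that because the perturbation is not assumed to vanish, the iterates can only be expected to reach a neighborhood of a stationary point whose size depends on the noise level, which is consistent with the notion of a "reasonable approximate solution" above.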