Abstract: | This paper studies the behaviour of a family of conjugate gradient optimization algorithms, of which the best known is probably that introduced in 1964 by Fletcher & Reeves. This family has the property that, on a quadratic function, the directions generated by any member of the family are the same set of conjugate directions, provided that an exact line search is performed at each iteration. In this paper a modification is introduced that enables this set of conjugate directions to be generated without any accurate line searches. This enables the minimum of a quadratic function to be found in at most (n+2) gradient evaluations. As the modification only requires the storage of two additional n-vectors, the storage advantage of conjugate gradient algorithms vis-à-vis variable metric algorithms is maintained. Finally, a numerical study is reported in which the performance of this new method is compared with that of various members of the unmodified family.
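For orientation, the classical Fletcher–Reeves member of this family can be sketched as below. This is a minimal illustrative sketch of the *unmodified* method with exact line searches on a quadratic f(x) = ½xᵀAx − bᵀx (for which the exact step has a closed form), not the paper's modification; the matrix `A`, vector `b`, and function name are illustrative assumptions.

```python
import numpy as np

def fletcher_reeves(A, b, x0, n_iter):
    """Fletcher-Reeves conjugate gradient on f(x) = 0.5*x'Ax - b'x,
    using the closed-form exact line search available for a quadratic."""
    x = x0.copy()
    g = A @ x - b                          # gradient of the quadratic
    d = -g                                 # initial search direction
    for _ in range(n_iter):
        if np.linalg.norm(g) < 1e-12:      # already at the minimum
            break
        alpha = -(g @ d) / (d @ A @ d)     # exact line search step length
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x

# On an n-dimensional quadratic with exact line searches, the
# minimum A^{-1} b is reached in at most n iterations (here n = 2).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = fletcher_reeves(A, b, np.zeros(2), n_iter=2)
```

The paper's contribution is precisely to dispense with the exact `alpha` computed above while still generating the same conjugate directions, at the cost of storing two extra n-vectors.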