Restart procedures for the conjugate gradient method

The conjugate gradient method is particularly useful for minimizing functions of very many variables because it does not require the storage of any matrices. However, the rate of convergence of the algorithm is only linear unless the iterative procedure is “restarted” occasionally. At present it is usual to restart every n or (n + 1) iterations, where n is the number of variables, but it is known that the frequency of restarts should depend on the objective function. Therefore the main purpose of this paper is to provide an algorithm with a restart procedure that takes account of the objective function automatically. Another purpose is to study a multiplying factor that occurs in the definition of the search direction of each iteration. Various expressions for this factor have been proposed, and often it does not matter which one is used. However, reasons are now given in favour of one of these expressions. Several numerical examples are reported in support of the conclusions of this paper.
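For reference, the following is a standard form of the nonlinear conjugate gradient recurrence and two common expressions for the multiplying factor mentioned above; the specific formulas and the notation (x_k, g_k, d_k, alpha_k, beta_k) are not given in the abstract and are supplied here only as an illustrative sketch.

```latex
% Standard nonlinear conjugate gradient iteration (illustrative notation,
% not taken from the abstract itself). g_k is the gradient of the
% objective function at the point x_k.
\begin{align*}
  x_{k+1} &= x_k + \alpha_k d_k,
    &&\alpha_k \text{ chosen by a line search},\\
  d_{k+1} &= -g_{k+1} + \beta_k d_k,
    &&d_1 = -g_1,\\
  \beta_k^{\mathrm{FR}} &= \frac{\|g_{k+1}\|^2}{\|g_k\|^2}
    &&\text{(Fletcher--Reeves)},\\
  \beta_k^{\mathrm{PR}} &= \frac{g_{k+1}^{T}\,(g_{k+1}-g_k)}{\|g_k\|^2}
    &&\text{(Polak--Ribi\`ere)}.
\end{align*}
% A restart discards the accumulated conjugacy information, for example by
% taking the next search direction along the steepest-descent direction
% -g_{k+1}; the paper is concerned with when such a restart should occur.
```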