
Conjugate Gradient Methods (CGMs)

Conjugate gradient methods (CGMs) are general-purpose second-order techniques for minimizing goal functions of several variables, with sound theoretical foundations [P88,Was95]. Second-order means that these methods make use of the second derivatives of the goal function, while first-order techniques like standard backpropagation use only the first derivatives. A second-order technique generally finds a better path to a (local) minimum than a first-order technique, but at a higher computational cost.
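The second-order information enters through the standard local quadratic model of the goal function; in the notation below (introduced here for illustration, with E the goal function, w the weight vector, and H the Hessian of E at w):

E(\mathbf{w} + \Delta\mathbf{w}) \approx E(\mathbf{w}) + \nabla E(\mathbf{w})^{T} \Delta\mathbf{w} + \tfrac{1}{2}\, \Delta\mathbf{w}^{T} H \, \Delta\mathbf{w}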

Like standard backpropagation, CGMs iteratively approach the minimum. But while standard backpropagation always proceeds down the gradient of the error function, a conjugate gradient method proceeds in a direction that is conjugate to the directions of the previous steps, as sketched below. Thus the minimization performed in one step is not partially undone by the next, as is the case with standard backpropagation and other gradient descent methods.
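Formally, two search directions d_i and d_j are conjugate with respect to the Hessian H if

d_i^{T} H \, d_j = 0 \quad (i \neq j)

The following is a minimal sketch of a nonlinear conjugate gradient minimizer in the Polak-Ribiere variant, given here only to illustrate the general technique; it is not the SNNS implementation, and all names (cg_minimize, line_search, rosenbrock, and their parameters) are invented for this example.

import numpy as np

def line_search(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4, max_halvings=50):
    # Backtracking line search: shrink alpha until the Armijo
    # sufficient-decrease condition holds.
    fx = f(x)
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fx + c * alpha * g.dot(d):
            break
        alpha *= rho
    return alpha

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g                       # first step is plain steepest descent
    for _ in range(max_iter):
        if g.dot(d) >= 0.0:      # restart if d is not a descent direction
            d = -g
        alpha = line_search(f, x, d, g)
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        # Polak-Ribiere coefficient (clipped at zero, "PR+"): chooses the
        # next direction so that, on a quadratic goal function, successive
        # directions are conjugate with respect to the Hessian.
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        g = g_new
    return x

# Usage: minimize the two-dimensional Rosenbrock function.
def rosenbrock(x):
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

print(cg_minimize(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0])))

Note that the conjugacy is exact only for a quadratic goal function; on a general error surface the Polak-Ribiere formula, combined with the restart to steepest descent, keeps successive directions approximately conjugate.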


