Gradient method

In optimization, a gradient method is an algorithm to solve problems of the form

[math]\displaystyle{ \min_{x\in\mathbb R^n}\; f(x) }[/math]

in which the search direction is defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
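
A minimal sketch of gradient descent with a fixed step size, applied to a simple quadratic, may help illustrate the idea; the objective, step size, and iteration count below are illustrative assumptions rather than part of the original article.

import numpy as np

def gradient_descent(grad, x0, step_size=0.1, num_iters=100):
    # Minimize f by repeatedly stepping along the negative gradient,
    # i.e. the search direction at the current point is -grad f(x).
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = x - step_size * grad(x)
    return x

# Example: minimize f(x) = ||x - 3||^2, whose gradient is 2*(x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3.0), x0=np.zeros(2))
print(x_min)  # converges toward [3. 3.]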

See also

  • Gradient descent
  • Conjugate gradient method

References

  • Elijah Polak (1997). Optimization: Algorithms and Consistent Approximations. Springer-Verlag. ISBN 0-387-94971-2.