Gradient method

From HandWiki

In optimization, a gradient method is an algorithm for solving problems of the form

[math]\displaystyle{ \min_{x\in\mathbb R^n}\; f(x) }[/math]

with the search directions defined by the gradient of the function at the current iterate. Examples of gradient methods include gradient descent and the conjugate gradient method.
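As a concrete illustration, the simplest gradient method, gradient descent with a fixed step size, can be sketched as follows. This is a minimal example, not a robust implementation; the quadratic objective, the step size, and the stopping tolerance are all illustrative choices.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize f by repeatedly stepping against its gradient (fixed step size)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is nearly zero
            break
        x = x - step * g  # search direction is the negative gradient
    return x

# Example objective: f(x) = x1^2 + 3*x2^2, with gradient (2*x1, 6*x2).
# The unique minimizer is the origin.
x_min = gradient_descent(lambda x: np.array([2 * x[0], 6 * x[1]]), [1.0, -1.0])
```

Practical gradient methods replace the fixed step with a line search or adaptive rule, since a step that is too large can diverge and one that is too small converges slowly.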


References

  • Polak, Elijah (1997). Optimization: Algorithms and Consistent Approximations. Springer-Verlag. ISBN 0-387-94971-2.