Descent direction
In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that, in the sense below, moves us closer towards a local minimum $\mathbf{x}^*$ of our objective function $f : \mathbb{R}^n \to \mathbb{R}$.
Suppose we are computing $\mathbf{x}^*$ by an iterative method, such as line search. We define a descent direction at the $k$th iterate $\mathbf{x}_k$ to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot, \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem: $f(\mathbf{x}_k + \alpha \mathbf{p}_k) = f(\mathbf{x}_k) + \alpha \langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle + o(\alpha)$, so the first-order term is negative for small $\alpha > 0$.
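As a concrete illustration, the following Python sketch checks the descent condition numerically; the quadratic objective and the helper name `is_descent_direction` are assumptions for the example, not part of the article.

```python
import numpy as np

def is_descent_direction(grad, p):
    """Check the descent condition <p, grad f(x_k)> < 0."""
    return np.dot(p, grad) < 0

# Illustrative objective f(x) = x1^2 + x2^2, whose gradient at x is 2x.
x = np.array([1.0, 2.0])
grad = 2 * x  # grad f(x) = (2, 4)

print(is_descent_direction(grad, np.array([-1.0, -1.0])))  # True: <p, grad> = -6 < 0
print(is_descent_direction(grad, np.array([1.0, 0.0])))    # False: <p, grad> = 2 > 0
```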
Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\| \nabla f(\mathbf{x}_k) \|^2 < 0$.
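A quick numerical check of this fact, again under an assumed quadratic objective (a sketch, not a general implementation):

```python
import numpy as np

def f(x):
    return np.sum(x ** 2)  # illustrative objective f(x) = ||x||^2

def grad_f(x):
    return 2 * x

x = np.array([1.0, 2.0])
p = -grad_f(x)       # negative gradient: p = (-2, -4)
alpha = 0.1          # a small step size

print(f(x))              # 5.0
print(f(x + alpha * p))  # 3.2 < 5.0: a small step along p reduced f
```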
Numerous methods exist to compute descent directions, all with differing merits. For example, one could use gradient descent or the conjugate gradient method.
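For instance, a minimal gradient-descent loop takes $\mathbf{p}_k = -\nabla f(\mathbf{x}_k)$ at every iterate. The sketch below assumes the same quadratic objective and a fixed step size for simplicity; practical implementations would instead choose the step by line search.

```python
import numpy as np

def f(x):
    return np.sum(x ** 2)  # illustrative objective f(x) = ||x||^2

def grad_f(x):
    return 2 * x

x = np.array([1.0, 2.0])
alpha = 0.1  # fixed step size; a line search would choose this adaptively
for k in range(50):
    p = -grad_f(x)     # descent direction at the kth iterate
    x = x + alpha * p  # small step along the descent direction

print(x, f(x))  # x is now very close to the minimizer (0, 0)
```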