Fermat's theorem (stationary points)

From Wikipedia, the free encyclopedia

Fermat's theorem is a theorem in real analysis, named after Pierre de Fermat. It gives a method to find local maxima and minima of differentiable functions by showing that every local extremum of the function is a stationary point (the derivative of the function is zero at that point). So, by using Fermat's theorem, the problem of finding a function's extrema is reduced to solving an equation.
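As a minimal numeric sketch of this reduction (the function f(x) = x³ − 3x and the helper names f and f_prime are illustrative choices, not part of the theorem), solving f′(x) = 3x² − 3 = 0 yields the stationary points x = ±1, which turn out to be the local extrema:

```python
def f(x):
    # sample function: f(x) = x^3 - 3x
    return x**3 - 3*x

def f_prime(x):
    # derivative computed by hand: d/dx (x^3 - 3x) = 3x^2 - 3
    return 3*x**2 - 3

# Solving f'(x) = 0, i.e. 3x^2 - 3 = 0, gives x = -1 and x = 1.
stationary = [-1.0, 1.0]
for x0 in stationary:
    assert abs(f_prime(x0)) < 1e-12  # both are stationary points

# Checking nearby values shows x = -1 is a local maximum
# and x = 1 is a local minimum of f:
eps = 1e-3
assert f(-1) > f(-1 - eps) and f(-1) > f(-1 + eps)
assert f(1) < f(1 - eps) and f(1) < f(1 + eps)
```

Instead of searching for extrema directly, one solves the equation f′(x) = 0 and then examines only the finitely many candidates it produces.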

It is important to note that Fermat's theorem gives only a necessary condition for a local extremum. That is, not every stationary point is an extremum; for example, x ↦ x³ has a stationary point at x = 0 that is an inflection point, not an extremum. To check whether a stationary point is an extremum, and to further distinguish between a local maximum and a local minimum, it is necessary to analyse the second derivative (if it exists).
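The two cases can be contrasted numerically (a sketch; the finite-difference helper second_derivative and the step size h are assumptions for illustration, not part of the second-derivative test itself):

```python
def second_derivative(f, x, h=1e-4):
    # central finite-difference approximation of f''(x)
    return (f(x + h) - 2*f(x) + f(x - h)) / h**2

square = lambda x: x**2   # stationary point at 0, and a local minimum
cube   = lambda x: x**3   # stationary point at 0, but NOT an extremum

# f''(0) > 0 confirms a local minimum for x^2 ...
assert second_derivative(square, 0.0) > 0
# ... while for x^3 the second derivative also vanishes at 0,
# so the test is inconclusive there (0 is an inflection point).
assert abs(second_derivative(cube, 0.0)) < 1e-6
```

When the second derivative is also zero, as for x³ at the origin, higher-order information is needed to classify the stationary point.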


Fermat's theorem

Let f : (a,b) → R be a function and suppose that x0 in (a,b) is a local extremum of f. If f is differentiable at x0, then \frac{df}{dx}(x_0) = 0.

Intuition

We give the intuition for a local maximum, the reasoning being similar for a local minimum. If x0 in (a,b) is a local maximum, then there is a (possibly small) neighbourhood of x0 such that the function is increasing before and decreasing after x0. As the derivative is positive for an increasing function and negative for a decreasing function, f' is positive before and negative after x0. By Darboux's theorem, f' doesn't skip values, so it has to be zero at some point between the positive and the negative values. The only point in the neighbourhood where it is possible to have f'(x) = 0 is x0.

Note that the theorem (and its proof below) is more general than the intuition in that it doesn't require the function to be differentiable over a neighbourhood around x0. As stated in the theorem, it is sufficient for the function to be differentiable only at the extremum itself.
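The sign change of the derivative around a maximum can be illustrated numerically (a sketch under stated assumptions: the example function 1 − x², the symmetric-difference helper derivative, and the sample points are all illustrative choices):

```python
def derivative(f, x, h=1e-6):
    # symmetric difference quotient, an approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2*h)

f = lambda x: 1 - x**2   # local maximum at x = 0

assert derivative(f, -0.5) > 0           # increasing before the maximum
assert derivative(f,  0.5) < 0           # decreasing after the maximum
assert abs(derivative(f, 0.0)) < 1e-9    # derivative vanishes at the maximum
```

Between a point where the derivative is positive and one where it is negative, Darboux's theorem forces a zero, and here that zero sits exactly at the maximum.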

Proof

Suppose that x0 is a local maximum (a similar argument applies if x0 is a local minimum). Then there exists δ > 0 such that (x0 - δ, x0 + δ) is a subset of (a,b) and such that f(x0) ≥ f(x) for all x with |x - x0| < δ. Hence for any h in (0,δ) we have

\frac{f(x_0+h) - f(x_0)}{h} \le 0.

Since the limit of this ratio as h → 0 from above exists and is equal to f'(x0), we conclude that f'(x0) ≤ 0. On the other hand, for h in (-δ,0) we notice that

\frac{f(x_0+h) - f(x_0)}{h} \ge 0

but again the limit as h → 0 from below exists and is equal to f'(x0), so we also have f'(x0) ≥ 0.

Hence we conclude that f'(x0) = 0.
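The two one-sided inequalities in the proof can be checked on a concrete example (a sketch; the function −(x − 1)² + 4 with maximum at x0 = 1 and the sample step sizes are illustrative assumptions):

```python
f = lambda x: -(x - 1)**2 + 4   # local maximum at x0 = 1
x0 = 1.0

# right-hand difference quotients (h > 0) are all <= 0 ...
for h in [0.5, 0.1, 0.01]:
    assert (f(x0 + h) - f(x0)) / h <= 0

# ... and left-hand difference quotients (h < 0) are all >= 0,
for h in [-0.5, -0.1, -0.01]:
    assert (f(x0 + h) - f(x0)) / h >= 0

# so the two-sided limit f'(x0), which exists by assumption, must be 0.
```

The differentiability hypothesis is what lets the two one-sided bounds squeeze the derivative to exactly zero.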
