Product rule

From Wikipedia, the free encyclopedia

In calculus, the product rule, also called Leibniz's law (see the derivation in abstract algebra below), governs the differentiation of products of differentiable functions.

It may be stated thus:

(fg)'=f'g+fg' \,

or in the Leibniz notation thus:

{d\over dx}(uv)=u{dv\over dx}+v{du\over dx}.


Discovery by Leibniz

Discovery of this rule is credited to Leibniz, who demonstrated it using differentials. Here is Leibniz's argument: Let u(x) and v(x) be two differentiable functions of x. Then the differential of uv is

d(uv)\, = (u + du)(v + dv) - uv\,
= u(dv) + v(du) + (du)(dv) \,

Since the term (du)(dv) is "negligible" (i.e. at least quadratic in du and dv), Leibniz concluded that

d(uv) = (du)v + u(dv) \,

and this is indeed the differential form of the product rule. If we divide through by the differential dx, we obtain

\frac{d}{dx} (uv) = \left( \frac{du}{dx} \right) v + u \left( \frac{dv}{dx} \right)

which can also be written in "prime notation" as

(uv)' = u' v + u v' \,
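
To see concretely why the (du)(dv) term is negligible, here is a minimal numeric sketch (using NumPy, with u(x) = x² and v(x) = sin x as our own illustrative choices, not part of Leibniz's argument): the discrepancy between Δ(uv) and u Δv + v Δu is exactly Δu Δv, which shrinks like h² while the retained terms shrink only like h.

import numpy as np

u = lambda x: x**2            # illustrative choice of u
v = lambda x: np.sin(x)       # illustrative choice of v
x = 1.0
for h in (1e-1, 1e-2, 1e-3):
    du = u(x + h) - u(x)
    dv = v(x + h) - v(x)
    d_uv = u(x + h) * v(x + h) - u(x) * v(x)
    # exact identity: d(uv) - (u*dv + v*du) = du*dv, an O(h^2) quantity
    print(h, d_uv - (u(x) * dv + v(x) * du), du * dv)

Each printed discrepancy equals du·dv and drops by roughly a factor of 100 whenever h drops by a factor of 10.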

Examples

  • Suppose one wants to differentiate f(x) = x² sin(x). By using the product rule, one gets the derivative f'(x) = 2x sin(x) + x² cos(x) (since the derivative of x² is 2x and the derivative of sin(x) is cos(x)); this computation is checked symbolically in the sketch after this list.
  • One special case of the product rule is the constant multiple rule, which states: if c is a real number and f(x) is a differentiable function, then cf(x) is also differentiable, and its derivative is (c × f)'(x) = c × f '(x). This follows from the product rule since the derivative of any constant is zero. This, combined with the sum rule for derivatives, shows that differentiation is linear.
  • The product rule can be used to derive the rule for integration by parts and the quotient rule.
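
As a quick symbolic check of the first two bullets (a minimal sketch using the SymPy library, which the article does not mention), direct differentiation agrees with the product rule and with the constant multiple rule:

import sympy as sp

x = sp.symbols('x')
f = x**2 * sp.sin(x)
print(sp.diff(f, x))                          # 2*x*sin(x) + x**2*cos(x)
# the product rule applied by hand: (x^2)' sin(x) + x^2 (sin x)'
by_hand = 2*x*sp.sin(x) + x**2*sp.cos(x)
print(sp.simplify(sp.diff(f, x) - by_hand))   # 0
# constant multiple rule: (c*f)' = c*f'
c = sp.symbols('c', real=True)
print(sp.simplify(sp.diff(c*f, x) - c*sp.diff(f, x)))  # 0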

A common error

It is a common error, when studying calculus, to suppose that the derivative of (uv) equals (u′)(v′); Leibniz himself made this error initially. However, it is easy to find counterexamples. Take any function f whose derivative f ′(x) is not identically zero, and write it as f(x) · 1, since 1 is the identity element for multiplication. If the misconception were true, the derivative of f(x) · 1 would be f ′(x) · 1′ = f ′(x) · 0 = 0, because the derivative of a constant such as 1 is zero; but f(x) · 1 = f(x), whose derivative is f ′(x), which need not be zero.
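
The same point can be checked symbolically. This minimal sketch (again using SymPy; the concrete choice u = v = x is our own added example) mirrors the f(x) · 1 argument above:

import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')
# The article's argument: write f(x) as f(x)*1.
lhs = sp.diff(f(x) * 1, x)                          # the true derivative, f'(x)
rhs = sp.diff(f(x), x) * sp.diff(sp.Integer(1), x)  # what the misconception predicts
print(lhs)   # Derivative(f(x), x)
print(rhs)   # 0
# A concrete counterexample (our choice): u = v = x gives (uv)' = 2x but u'v' = 1.
print(sp.diff(x*x, x), sp.diff(x, x) * sp.diff(x, x))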

Proof of the product rule

A rigorous proof of the product rule can be given using the properties of limits and the definition of the derivative as a limit of Newton's difference quotient:

Suppose

f(x) = g(x)h(x) \,

and suppose further that g and h are each differentiable at the fixed number x. Then

f'(x)\! = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}
= \lim_{\Delta x \to 0} \frac{g(x + \Delta x)h(x + \Delta x) - g(x)h(x)}{\Delta x}
= \lim_{\Delta x \to 0} \frac{g(x)h(x + \Delta x) - g(x)h(x) + g(x + \Delta x)h(x + \Delta x) - g(x)h(x + \Delta x)}{\Delta x}
= \lim_{\Delta x \to 0} \frac{g(x)(h(x + \Delta x) - h(x)) + h(x + \Delta x)(g(x + \Delta x) - g(x))}{\Delta x}
= \lim_{\Delta x \to 0} \left(g(x)\frac{h(x + \Delta x) - h(x)}{\Delta x} + h(x + \Delta x) \frac{g(x + \Delta x) - g(x)}{\Delta x}\right)

Since h is differentiable at x, it is continuous at x, and so

\lim_{\Delta x \to 0} h(x + \Delta x) = h(x)

and, by the definition of the derivative and the differentiability of g and h at x, we also have

g'(x)=\lim_{\Delta x \to 0} \frac{g(x + \Delta x) - g(x)}{\Delta x}
h'(x)=\lim_{\Delta x \to 0} \frac{h(x + \Delta x) - h(x)}{\Delta x}

Since all four limits on the right exist, we are justified in splitting the limit of the sum into a sum of products of limits. Putting everything together, we have

f'(x) = \lim_{\Delta x \to 0} \left[g(x)\left(\frac{h(x + \Delta x) - h(x)}{\Delta x}\right) + h(x + \Delta x)\left(\frac{g(x + \Delta x) - g(x)}{\Delta x}\right)\right]
= \left[\lim_{\Delta x \to 0} g(x)\right]\left[\lim_{\Delta x \to 0} \frac{h(x + \Delta x) - h(x)}{\Delta x}\right] + \left[\lim_{\Delta x \to 0} h(x + \Delta x)\right]\left[\lim_{\Delta x \to 0}\frac{g(x + \Delta x) - g(x)}{\Delta x}\right]
= g(x)h'(x) + h(x)g'(x) \,

and this completes the proof.
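
The decomposition used in the proof can also be observed numerically. In this minimal sketch, the choices g(x) = exp(x), h(x) = cos(x), and the point x = 0.5 are our own; the decomposed difference quotient approaches g(x)h'(x) + h(x)g'(x) as Δx shrinks:

import numpy as np

g, h = np.exp, np.cos
gp = np.exp                      # g'
hp = lambda x: -np.sin(x)        # h'
x = 0.5
for dx in (1e-2, 1e-4, 1e-6):
    # the decomposition used in the proof: g(x)*[h-quotient] + h(x+dx)*[g-quotient]
    quotient = g(x) * (h(x + dx) - h(x)) / dx + h(x + dx) * (g(x + dx) - g(x)) / dx
    print(dx, quotient, g(x) * hp(x) + h(x) * gp(x))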

Alternate proof

Let u(x) and v(x) be functions differentiable at x, and let f(x) = u(x)v(x). We show that f'(x) = u'(x)v(x) + u(x)v'(x) by rewriting the right-hand side of this identity until it becomes the difference quotient that defines f'(x). By the definition of the derivative,

u'(x)v(x) + u(x)v'(x) = \left( \lim_{h \to 0} \frac{u(x+h)-u(x)}{h} \right) v(x) + u(x) \left( \lim_{h \to 0} \frac{v(x+h)-v(x)}{h} \right)

Because v(x) is differentiable at x, it follows that v(x) is continuous at x, and therefore, by the definition of continuity,

v(x) = \lim_{h \to 0} v(x + h)

Substituting this limit for v(x), and noting that u(x) does not depend on h (so u(x) = \lim_{h \to 0} u(x)), we get

u'(x)v(x) + u(x)v'(x) = \left( \lim_{h \to 0} \frac{u(x+h)-u(x)}{h} \right) \left( \lim_{h \to 0} v(x + h) \right) + \left( \lim_{h \to 0} u(x) \right) \left( \lim_{h \to 0} \frac{v(x+h)-v(x)}{h} \right)

Since each of these limits exists, the products and sum of limits equal the limit of the combined difference quotient, so

u'(x)v(x) + u(x)v'(x) = \lim_{h \to 0} \frac{u(x+h) v(x+h) - u(x) v(x+h) + u(x) v(x+h) - u(x) v(x)}{h}
= \lim_{h \to 0} \frac{u(x+h)v(x+h) - u(x)v(x)}{h}
= \left( u(x)v(x) \right)' = f'(x)

Q.E.D.

Generalizations

The product rule can be generalized to products of more than two factors. For example, for three factors we have

\frac{d(uvw)}{dx} = \frac{du}{dx}vw + u\frac{dv}{dx}w + uv\frac{dw}{dx}

For a collection of differentiable functions f_1, \dots, f_k, none of which vanishes at x, this can be written more succinctly as

\frac{d}{dx} \prod_{i=1}^k f_i(x)  = \left(\sum_{i=1}^k \frac{\frac{d}{dx} f_i(x)}{f_i(x)}\right)    \prod_{i=1}^k f_i(x)
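
As an illustration (a minimal SymPy sketch; the three factors here are our own choice), both the three-factor rule and the sum-over-factors form above agree with direct differentiation wherever no factor vanishes:

import sympy as sp

x = sp.symbols('x')
f1, f2, f3 = x**2, sp.sin(x), sp.exp(x)      # illustrative factors
product = f1 * f2 * f3
direct = sp.diff(product, x)
# three-factor rule: u'vw + uv'w + uvw'
three_factor = sp.diff(f1, x)*f2*f3 + f1*sp.diff(f2, x)*f3 + f1*f2*sp.diff(f3, x)
# sum-over-factors form: (sum f_i'/f_i) * prod f_i, valid where no f_i vanishes
sum_form = (sp.diff(f1, x)/f1 + sp.diff(f2, x)/f2 + sp.diff(f3, x)/f3) * product
print(sp.simplify(direct - three_factor))    # 0
print(sp.simplify(direct - sum_form))        # 0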

It can also be generalized to the Leibniz rule for higher derivatives of a product of two factors: if y = uv and y^{(n)} denotes the n-th derivative of y, then

y^{(n)}(x) = \sum_{k=0}^n {n \choose k} u^{(n-k)}(x)\; v^{(k)}(x).

See also binomial coefficient and the formally quite similar binomial theorem.
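
For instance (a minimal SymPy sketch; the choices of u, v, and n = 3 are ours), the Leibniz formula reproduces the third derivative of a product term by term:

import sympy as sp

x = sp.symbols('x')
u, v, n = sp.exp(2*x), sp.sin(x), 3          # illustrative choices
direct = sp.diff(u*v, x, n)
leibniz = sum(sp.binomial(n, k) * sp.diff(u, x, n - k) * sp.diff(v, x, k)
              for k in range(n + 1))
print(sp.simplify(direct - leibniz))         # 0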

If X, Y, and Z are Banach spaces (which include Euclidean spaces) and B : X × Y → Z is a continuous bilinear operator, then B is differentiable, and its derivative at the point (x,y) in X × Y is the linear map D_{(x,y)}B : X × Y → Z given by

(D_{(x,y)}\,B)(u,v) = B(u,y) + B(x,v) \qquad \forall (u,v)\in X \times Y.
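
A familiar instance is matrix multiplication, B(X, Y) = XY, a continuous bilinear map on pairs of square matrices; the formula then says the derivative at (X, Y) sends (U, V) to UY + XV. Here is a minimal numeric sketch (the random 3 × 3 matrices and step sizes are our own illustrative choices):

import numpy as np

rng = np.random.default_rng(0)
X, Y = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
U, V = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
B = lambda a, b: a @ b                        # bilinear map: matrix multiplication
for t in (1e-2, 1e-4):
    finite_diff = (B(X + t*U, Y + t*V) - B(X, Y)) / t
    derivative = B(U, Y) + B(X, V)            # D_(X,Y)B applied to (U, V)
    print(t, np.max(np.abs(finite_diff - derivative)))   # error shrinks like t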


Derivation in abstract algebra

In abstract algebra, the product rule is used to define what is called a derivation, not vice versa.
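
Concretely (stating the standard definition, which the sentence above only alludes to), a derivation on an algebra A is a linear map D : A → A satisfying the Leibniz identity

D(ab) = D(a)\,b + a\,D(b) \qquad \text{for all } a, b \in A.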

See also