Product rule

From Wikipedia, the free encyclopedia

In calculus, the product rule, also called Leibniz's law (see the derivation below), governs the differentiation of products of differentiable functions.

It may be stated thus:

(fg)'=f'g+fg' \,

or in the Leibniz notation thus:

{d\over dx}(uv)=u{dv\over dx}+v{du\over dx}.

Discovery by Leibniz

Discovery of this rule is credited to Gottfried Leibniz, who demonstrated it using differentials. Here is Leibniz's argument: Let u(x) and v(x) be two differentiable functions of x. Then the differential of uv is


\begin{align}
d(uv) & {} = (u + du)(v + dv) - uv \\
& {} = u\,dv + v\,du + du\,dv.
\end{align}

Since the term (du)(dv) is "negligible" (i.e. at least quadratic in du and dv), Leibniz concluded that

d(uv) = v\,du + u\,dv \,

and this is indeed the differential form of the product rule. If we divide through by the differential dx, we obtain

\frac{d}{dx} (uv) = v \left( \frac{du}{dx} \right) + u \left( \frac{dv}{dx} \right)

which can also be written in "prime notation" as

(uv)' = v u' + u v'. \,
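Leibniz's step of discarding the (du)(dv) term can be illustrated numerically. The sketch below (not part of the original argument; the functions u(x) = x² and v(x) = sin x are arbitrary example choices) shows that the neglected term shrinks quadratically in dx, while the retained terms shrink only linearly:

```python
import math

def deltas(dx, x=1.0):
    """Return the neglected term du*dv and the retained terms u*dv + v*du."""
    u = lambda t: t**2          # example function, not from the source
    v = lambda t: math.sin(t)   # example function, not from the source
    du = u(x + dx) - u(x)
    dv = v(x + dx) - v(x)
    return du * dv, u(x) * dv + v(x) * du

for dx in (1e-1, 1e-2, 1e-3):
    quad, lin = deltas(dx)
    # quad falls by ~100x per step (quadratic); lin falls by ~10x (linear)
    print(f"dx={dx:g}  du*dv={quad:.2e}  u*dv+v*du={lin:.2e}")
```

Dividing both quantities by dx and letting dx → 0, the quadratic term vanishes while the linear terms converge, which is the content of Leibniz's conclusion.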

Examples

  • Suppose one wants to differentiate f(x) = x^2 sin(x). By the product rule, the derivative is f'(x) = 2x sin(x) + x^2 cos(x), since the derivative of x^2 is 2x and the derivative of sin(x) is cos(x).
  • One special case of the product rule is the constant multiple rule which states: if c is a real number and f(x) is a differentiable function, then cf(x) is also differentiable, and its derivative is (c × f)'(x) = c × f '(x). This follows from the product rule since the derivative of any constant is zero. This, combined with the sum rule for derivatives, shows that differentiation is linear.
  • The product rule can be used to derive the rule for integration by parts and (weak version of) the quotient rule. (It is a "weak" version in that it does not prove that the quotient is differentiable, but only says what its derivative is if it is differentiable.)
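The first example above can be checked numerically. In this sketch (the sample point x = 1.3 and the step size are arbitrary choices), a central difference quotient approximates f' and is compared with the product-rule answer:

```python
import math

def f(x):
    """f(x) = x^2 sin(x), the function from the first example."""
    return x**2 * math.sin(x)

def fprime(x):
    """Product-rule derivative: 2x sin(x) + x^2 cos(x)."""
    return 2*x*math.sin(x) + x**2*math.cos(x)

def central_diff(g, x, h=1e-6):
    """Symmetric difference quotient approximating g'(x)."""
    return (g(x + h) - g(x - h)) / (2*h)

x = 1.3
# The two values should agree closely (the difference quotient has O(h^2) error).
print(central_diff(f, x), fprime(x))
```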

A common error

It is a common error, when studying calculus, to suppose that the derivative of uv equals (u′)(v′); Leibniz himself made this error initially. It is, however, easy to find counterexamples. Take any function f(x) with derivative f′(x), and write it as f(x) · 1, since 1 is the identity element for multiplication. If the misconception were true, the derivative of f(x) · 1 would be f′(x) · 0 = 0, because the derivative of the constant 1 is zero; so every function would have derivative zero, which is absurd.
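The simplest concrete counterexample, u(x) = v(x) = x, can be checked directly: the true derivative of x·x = x² is 2x, while the misconceived rule gives u′·v′ = 1·1 = 1. A small numerical sketch (the sample point is an arbitrary choice):

```python
def central_diff(g, x, h=1e-6):
    """Symmetric difference quotient approximating g'(x)."""
    return (g(x + h) - g(x - h)) / (2*h)

x = 3.0
true_deriv = central_diff(lambda t: t * t, x)   # derivative of x^2 at 3 is 6
wrong_rule = 1.0 * 1.0                          # u'(x) * v'(x) = 1 everywhere
print(true_deriv, wrong_rule)                   # 6 vs. 1: the rules disagree
```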

Proof of the product rule

A rigorous proof of the product rule can be given using the properties of limits and the definition of the derivative as a limit of Newton's difference quotient.

Suppose

 h(x) = f(x)g(x),\,

and that f and g are each differentiable at the fixed number x. Then

h'(x) = \lim_{w\to x}{ h(w) - h(x) \over w - x} = \lim_{w\to x}{f(w)g(w) - f(x)g(x) \over w - x}. \qquad\qquad(1)

Now the difference

 f(w)g(w) - f(x)g(x)\qquad\qquad(2)

is, if one pictures f(w), g(w) and f(x), g(x) as the side lengths of a large and a small rectangle, the area of the large rectangle minus the area of the small one.

That L-shaped difference region can be split into two rectangles, the sum of whose areas is readily seen to be

 f(x) \Bigg( g(w) - g(x) \Bigg) + g(w)\Bigg( f(w) - f(x) \Bigg).\qquad\qquad(3)

(The picture covers only some cases, since f(w) need not actually be bigger than f(x) and g(w) need not actually be bigger than g(x); nonetheless, the equality of (2) and (3) is easily checked by algebra.)
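The equality of (2) and (3) is pure algebra and holds for any f, g, x, and w. A quick numerical spot-check (the functions exp and sin are arbitrary example choices, not prescribed by the proof):

```python
import math
import random

f = math.exp   # example function, not from the article
g = math.sin   # example function, not from the article

# Verify f(w)g(w) - f(x)g(x) == f(x)(g(w) - g(x)) + g(w)(f(w) - f(x))
# at many random points; the identity is exact, so only rounding error remains.
for _ in range(100):
    x, w = random.uniform(-2, 2), random.uniform(-2, 2)
    lhs = f(w)*g(w) - f(x)*g(x)
    rhs = f(x)*(g(w) - g(x)) + g(w)*(f(w) - f(x))
    assert abs(lhs - rhs) < 1e-12
```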

Therefore the expression in (1) is equal to

\lim_{w\to x}\left( f(x) \left( {g(w) - g(x) \over w - x} \right) + g(w)\left( {f(w) - f(x) \over w - x} \right) \right).\qquad\qquad(4)

If all four of the limits in (5) below exist, then the expression in (4) is equal to

 \left(\lim_{w\to x}f(x)\right) \left(\lim_{w\to x} {g(w) - g(x) \over w - x}\right)
+ \left(\lim_{w\to x} g(w)\right) \left(\lim_{w\to x} {f(w) - f(x) \over w - x} \right).
\qquad\qquad(5)

Now

\lim_{w\to x}f(x) = f(x)\,

because f(x) remains constant as w → x;

 \lim_{w\to x} {g(w) - g(x) \over w - x} = g'(x)

because g is differentiable at x;

 \lim_{w\to x} {f(w) - f(x) \over w - x} = f'(x)

because f is differentiable at x;

and now the "hard" one:

 \lim_{w\to x} g(w) = g(x)\,

because g is continuous at x; this follows from the theorem that differentiable functions are continuous.

We conclude that the expression in (5) is equal to

 f(x)g'(x) + g(x)f'(x). \,

Alternative proof: using logarithms

Let f = uv and suppose u and v are positive. Then

\ln f = \ln u + \ln v.\,

Differentiating both sides:

{1 \over f} {d \over dx} f = {1 \over u} {d \over dx} u + {1 \over v} {d \over dx} v

and so, multiplying the left side by f and the right side by uv (which are equal),

{d \over dx} f = v {d \over dx} u + u {d \over dx} v.

The proof appears in [1]. Note that since u and v must be continuous, the assumption of positivity does not diminish the generality.

This proof relies on the chain rule and on the properties of the natural logarithm function, both of which are deeper than the product rule. From one point of view, that is a disadvantage of this proof. On the other hand, the simplicity of the algebra in this proof perhaps makes it easier to understand than a proof using the definition of differentiation directly.
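The key step of the logarithmic proof, (ln f)′ = f′/f = u′/u + v′/v, can be sketched numerically. The positive example functions below and the sample point are arbitrary choices, not from the article:

```python
import math

u = lambda x: x**2 + 1     # positive example function (assumption)
v = lambda x: math.exp(x)  # positive example function (assumption)

def central_diff(g, x, h=1e-6):
    """Symmetric difference quotient approximating g'(x)."""
    return (g(x + h) - g(x - h)) / (2*h)

f = lambda t: u(t) * v(t)
x = 0.7
lhs = central_diff(f, x) / f(x)                               # (ln f)' = f'/f
rhs = central_diff(u, x) / u(x) + central_diff(v, x) / v(x)   # u'/u + v'/v
print(lhs, rhs)  # the two should agree closely
```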

Alternative proof: using the chain rule

The product rule can be considered a special case of the chain rule for functions of several variables:

 {d (ab) \over dx} = \frac{\partial(ab)}{\partial a}\frac{da}{dx}+\frac{\partial (ab)}{\partial b}\frac{db}{dx} = b \frac{da}{dx} + a \frac{db}{dx}.

Generalizations

A product of more than two factors

The product rule can be generalized to products of more than two factors. For example, for three factors we have

\frac{d(uvw)}{dx} = \frac{du}{dx}vw + u\frac{dv}{dx}w + uv\frac{dw}{dx}.

For a collection of differentiable functions f_1, \dots, f_k, none of which vanishes at x, we have

\frac{d}{dx} \prod_{i=1}^k f_i(x)
 = \left(\sum_{i=1}^k \frac{\frac{d}{dx} f_i(x)}{f_i(x)}\right)
   \prod_{i=1}^k f_i(x).
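The k-factor formula can be spot-checked numerically. In this sketch the three factors (and their hand-computed derivatives) are arbitrary example choices that do not vanish at the sample point:

```python
import math

fs      = [math.sin, math.exp, lambda t: t**2 + 1]   # example factors
fprimes = [math.cos, math.exp, lambda t: 2*t]        # their derivatives

def product_at(x):
    """Value of the product f_1(x) * ... * f_k(x)."""
    p = 1.0
    for f in fs:
        p *= f(x)
    return p

def central_diff(g, x, h=1e-6):
    """Symmetric difference quotient approximating g'(x)."""
    return (g(x + h) - g(x - h)) / (2*h)

x = 0.9
# (sum of f_i'/f_i) * product, per the formula above
formula = sum(fp(x) / f(x) for f, fp in zip(fs, fprimes)) * product_at(x)
numeric = central_diff(product_at, x)
print(formula, numeric)  # the two should agree closely
```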

Higher derivatives

It can also be generalized to the Leibniz rule for higher derivatives of a product of two factors: if y = uv and y(n) denotes the n-th derivative of y, then

y^{(n)}(x) = \sum_{k=0}^n {n \choose k} u^{(n-k)}(x)\; v^{(k)}(x).

See also the binomial coefficient and the formally quite similar binomial theorem, as well as the Leibniz rule (generalized product rule).
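As a worked instance, take u = x^2 and v = x^3, so y = uv = x^5 and y'' = 20x^3. The sketch below (the derivative tables are hand-computed for these example factors) evaluates the Leibniz sum for n = 2:

```python
from math import comb

def u_deriv(k, x):
    """k-th derivative of u(x) = x^2 (hand-computed example table)."""
    return [x**2, 2*x, 2.0, 0.0][k] if k <= 3 else 0.0

def v_deriv(k, x):
    """k-th derivative of v(x) = x^3 (hand-computed example table)."""
    return [x**3, 3*x**2, 6*x, 6.0][k] if k <= 3 else 0.0

def leibniz(n, x):
    """Leibniz rule: sum over k of C(n, k) * u^(n-k)(x) * v^(k)(x)."""
    return sum(comb(n, k) * u_deriv(n - k, x) * v_deriv(k, x)
               for k in range(n + 1))

x = 2.0
print(leibniz(2, x), 20 * x**3)  # both 160.0
```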

Higher partial derivatives

For partial derivatives, we have

{\partial^n \over \partial x_1\,\cdots\,\partial x_n} (uv)
= \sum_S {\partial^{|S|} u \over \prod_{i\in S} \partial x_i} \cdot {\partial^{n-|S|} v \over \prod_{i\not\in S} \partial x_i}

where the index S runs through all 2^n subsets of {1, ..., n}. If this seems hard to understand, consider the case n = 3:

\begin{align} &{}\quad {\partial^3 \over \partial x_1\,\partial x_2\,\partial x_3} (uv)  \\  \\
&{}= u \cdot{\partial^3 v \over \partial x_1\,\partial x_2\,\partial x_3} + {\partial u \over \partial x_1}\cdot{\partial^2 v \over \partial x_2\,\partial x_3} +  {\partial u \over \partial x_2}\cdot{\partial^2 v \over \partial x_1\,\partial x_3} + {\partial u \over \partial x_3}\cdot{\partial^2 v \over \partial x_1\,\partial x_2} \\  \\
&{}\qquad + {\partial^2 u \over \partial x_1\,\partial x_2}\cdot{\partial v \over \partial x_3}
+ {\partial^2 u \over \partial x_1\,\partial x_3}\cdot{\partial v \over \partial x_2}
+ {\partial^2 u \over \partial x_2\,\partial x_3}\cdot{\partial v \over \partial x_1}
+ {\partial^3 u \over \partial x_1\,\partial x_2\,\partial x_3}\cdot v. \end{align}

A product rule in Banach spaces

If X, Y, and Z are Banach spaces (which include Euclidean spaces) and B : X × Y → Z is a continuous bilinear operator, then B is differentiable, and its derivative at the point (x,y) in X × Y is the linear map D(x,y)B : X × Y → Z given by

 (D_\left( x,y \right)\,B)\left( u,v \right) = B\left( u,y \right) + B\left( x,v \right)\qquad\forall (u,v)\in X \times Y.

Derivations in abstract algebra

In abstract algebra, the product rule is used to define what is called a derivation, not vice versa.

For vector functions

For vector-valued functions whose product is a vector product (one that multiplies two vectors and yields a vector, such as the cross product), the product rule changes somewhat: such products are anticommutative, so the order of the factors must be preserved. Here the product rule must be applied as

(fg)'=f'g+fg' \,

and not

(fg)'=f'g+g'f \,, even though this would be correct for multiplication of scalars.
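This order sensitivity is easy to see with the cross product. In the sketch below (the curves a(t) and b(t) are arbitrary example choices), the correct rule a′ × b + a × b′ gives the derivative of a × b, while swapping the factors in the second term flips its sign and changes the result:

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

# Example curves (assumptions): a(t) = (t, 0, 0), b(t) = (0, t, 0),
# so a(t) x b(t) = (0, 0, t^2) and its derivative is (0, 0, 2t).
def a(t):  return (t, 0.0, 0.0)
def ap(t): return (1.0, 0.0, 0.0)   # a'(t)
def b(t):  return (0.0, t, 0.0)
def bp(t): return (0.0, 1.0, 0.0)   # b'(t)

t = 1.5
# Correct order: a' x b + a x b'
correct = tuple(p + q for p, q in zip(cross(ap(t), b(t)), cross(a(t), bp(t))))
# Wrong order in the second term: a' x b + b' x a (sign flipped)
swapped = tuple(p + q for p, q in zip(cross(ap(t), b(t)), cross(bp(t), a(t))))
print(correct, swapped)  # the two results differ
```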

An application

Among the applications of the product rule is a proof that

 {d \over dx} x^n = nx^{n-1}

when n is a positive integer (the rule is in fact true even if n is not positive or is not an integer, but the proof of that must rely on other methods). The proof is by mathematical induction on the exponent n. If n = 0 then x^n is constant and nx^{n-1} = 0; the rule holds in that case because the derivative of a constant function is 0. If the rule holds for a particular exponent n, then for the next value, n + 1, we have

\begin{align}
{d \over dx}x^{n+1} &{}= {d \over dx}\left( x^n\cdot x\right) \\  \\
&{}= x{d \over dx} x^n + x^n{d \over dx}x \qquad\mbox{(the product rule is used here)} \\  \\
&{}= x\left(nx^{n-1}\right) + x^n\cdot 1\qquad\mbox{(the induction hypothesis is used here)} \\  \\
&{}= (n + 1)x^n.
\end{align}

Therefore, if the proposition is true for n, it is true also for n + 1, which completes the induction.
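The conclusion d/dx x^n = n x^{n-1} can be spot-checked numerically for a few exponents (the sample point is an arbitrary choice):

```python
def central_diff(g, x, h=1e-6):
    """Symmetric difference quotient approximating g'(x)."""
    return (g(x + h) - g(x - h)) / (2*h)

x = 1.7
for n in (1, 2, 3, 5):
    numeric = central_diff(lambda t: t**n, x)  # difference-quotient estimate
    exact = n * x**(n - 1)                     # power rule
    print(n, numeric, exact)  # each pair should agree closely
```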
