Sum rule in differentiation

In calculus, the sum rule in differentiation is a method of finding the derivative of a function that is the sum of two other functions for which derivatives exist. It is part of the linearity of differentiation, and the sum rule in integration follows from it. The rule itself is a direct consequence of differentiation from first principles.

The sum rule states that for two differentiable functions u and v:

\frac{d}{dx}(u + v)=\frac{du}{dx}+\frac{dv}{dx}

This rule also applies to subtraction and to additions and subtractions of more than two functions:

\frac{d}{dx}(u + v + w + \dots)=\frac{du}{dx}+\frac{dv}{dx}+\frac{dw}{dx}+\cdots
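For example, taking the arbitrary illustrative functions u = x^2, v = \sin x and w = e^x:

 \frac{d}{dx}\left(x^2 + \sin x + e^x\right) = 2x + \cos x + e^x.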

Proof

Simple Proof

Let h(x) = f(x) + g(x), and suppose that f and g are each differentiable at x. We want to prove that h is differentiable at x and that its derivative h'(x) is given by f'(x)+g'(x).

h'(x) = \lim_{a\to 0} \frac{h(x+a)-h(x)}{a}

 = \lim_{a\to 0} \frac{[f(x+a)+g(x+a)]-[f(x)+g(x)]}{a}
 = \lim_{a\to 0} \frac{[f(x+a)-f(x)]+[g(x+a)-g(x)]}{a}
 = \lim_{a\to 0} \frac{f(x+a)-f(x)}{a} + \lim_{a\to 0} \frac{g(x+a)-g(x)}{a}
 = f'(x)+g'(x).

Splitting the limit in the penultimate step is justified because the limit of a sum is the sum of the limits whenever both limits exist, and both limits here do exist, since f and g are differentiable at x.

More Complicated Proof

Let y be a function given by the sum of two differentiable functions u and v, so that:

 y = u + v

Now increase x by a small increment Δx, and let Δy, Δu and Δv be the corresponding changes in y, u and v. Hence:

 y + \Delta{y} = (u + \Delta{u}) + (v + \Delta{v}) = u + v + \Delta{u} + \Delta{v} = y + \Delta{u} + \Delta{v}.

So:

 \Delta{y} = \Delta{u} + \Delta{v}.

Now divide throughout by Δx:

 \frac{\Delta{y}}{\Delta{x}} = \frac{\Delta{u}}{\Delta{x}} + \frac{\Delta{v}}{\Delta{x}}.

Let Δx tend to 0:

 \frac{dy}{dx} = \frac{du}{dx} + \frac{dv}{dx}.

Now recall that y = u + v, giving the sum rule in differentiation:

 \frac{d}{dx}\left(u + v\right) = \frac{du}{dx} + \frac{dv}{dx} .

The rule can be extended to subtraction, as follows:

 \frac{d}{dx}\left(u - v\right) = \frac{d}{dx}\left(u + (-v)\right) = \frac{du}{dx} + \frac{d}{dx}\left(-v\right).

Now use the special case of the constant factor rule in differentiation with k=−1 to obtain:

 \frac{d}{dx}\left(u - v\right) = \frac{du}{dx} + \left(-\frac{dv}{dx}\right) = \frac{du}{dx} - \frac{dv}{dx}.

Therefore, the sum rule extends to cover both addition and subtraction:

 \frac{d}{dx}\left(u \pm v\right) = \frac{du}{dx} \pm \frac{dv}{dx}.
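As a quick illustration with an arbitrary pair of functions (valid for x > 0, where \ln x is defined):

 \frac{d}{dx}\left(x^3 - \ln x\right) = 3x^2 - \frac{1}{x}.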

The sum rule in differentiation can be used as part of the derivation for both the sum rule in integration and linearity of differentiation.

Generalization to finite sums

Consider a set of differentiable functions f_1, f_2, ..., f_n. Then

 \frac{d}{dx} \left(\sum_{1 \le i \le n} f_i(x)\right) = \frac{d}{dx}\left(f_1(x) + f_2(x) + \cdots + f_n(x)\right) = \frac{d}{dx}f_1(x) + \frac{d}{dx}f_2(x) + \cdots + \frac{d}{dx}f_n(x)

so

 \frac{d}{dx} \left(\sum_{1 \le i \le n} f_i(x)\right) = \sum_{1 \le i \le n} \left(\frac{d}{dx}f_i(x)\right) .

In other words, the derivative of any finite sum of functions is the sum of the derivatives of those functions.
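A familiar illustration is the term-by-term differentiation of a polynomial; note that the constant factor rule and the power rule are also needed here, to differentiate each individual term a_i x^i:

 \frac{d}{dx}\left(\sum_{i=0}^{n} a_i x^i\right) = \sum_{i=1}^{n} i a_i x^{i-1}.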

This follows by induction; we have just proven it to be true for n = 2 (and the case n = 1 is trivial). Assume it is true for all n < k, and define

g(x)=\sum_{i=1}^{k-1} f_i(x).

Then

\sum_{i=1}^k f_i(x)=g(x)+f_k(x)

and it follows from the proof above that

 \frac{d}{dx} \left(\sum_{i=1}^k f_i(x)\right) =  \frac{d}{dx}g(x)+\frac{d}{dx}f_k(x).

By the inductive hypothesis,

\frac{d}{dx}g(x)=\frac{d}{dx} \left(\sum_{i=1}^{k-1} f_i(x)\right)=\sum_{i=1}^{k-1} \frac{d}{dx}f_i(x)

so

\frac{d}{dx} \left(\sum_{i=1}^k f_i(x) \right) = \sum_{i=1}^{k-1} \frac{d}{dx}f_i(x) + \frac{d}{dx}f_k(x)=\sum_{i=1}^k \frac{d}{dx}f_i(x)

which completes the induction and hence the proof of the sum rule of differentiation.

Note that this does not automatically extend to infinite sums. An intuitive reason why things can go wrong is that more than one limit is involved (specifically, one for the sum and one in the definition of the derivative), and these limits cannot always be interchanged. Uniform convergence, in particular of the series of derivatives, addresses these issues.
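A standard illustrative counterexample: the series

 \sum_{n=1}^{\infty} \frac{\sin(n^2 x)}{n^2}

converges uniformly on all of \mathbb{R} by the Weierstrass M-test, since each term is bounded in absolute value by 1/n^2. Differentiating term by term, however, gives

 \sum_{n=1}^{\infty} \cos(n^2 x),

which diverges (at x = 0 every term equals 1), so the sum rule cannot be applied term by term to this series.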
