Leibniz integral rule

For other meanings of this term, see Leibniz's rule (disambiguation).

In mathematics, Leibniz's rule for differentiation under the integral sign, named after Gottfried Leibniz, states that if we have an integral of the form

\int_{y_0}^{y_1} f(x, y) \,dy

then for x \in (x_0, x_1) the derivative of this integral may be expressed as

{d\over dx}\, \int_{y_0}^{y_1} f(x, y) \,dy = \int_{y_0}^{y_1} {\partial \over \partial x} f(x,y)\,dy

provided that f and \partial f / \partial x are both continuous on a rectangle of the form

[x_0,x_1]\times[y_0,y_1].
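
As a simple illustrative check (an example added here, not part of the original statement), take f(x, y) = x^2 y on [x_0, x_1] \times [0, 1]. Integrating first gives

\int_0^1 x^2 y \,dy = {x^2 \over 2}, \qquad {d\over dx}\,{x^2 \over 2} = x,

and differentiating under the integral sign gives the same result:

\int_0^1 {\partial \over \partial x}\left(x^2 y\right) dy = \int_0^1 2xy \,dy = x.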

Proof

The proof is straightforward: first define

u(x) = \int_{y_0}^{y_1} f(x, y) \,dy.

Then

{d\over dx} u(x) = \lim_{h\rightarrow 0} {u(x+h)-u(x) \over h}.

Substituting the definition of u, this becomes

= \lim_{h\rightarrow 0} {\int_{y_0}^{y_1}f(x+h,y)\,dy-\int_{y_0}^{y_1}f(x,y)\,dy \over h}.

Since integration is linear, we can write the two integrals as one:

= \lim_{h\rightarrow 0} {\int_{y_0}^{y_1}(f(x+h,y)-f(x,y))\,dy\over h}.

Since the factor 1/h does not depend on y, it may be taken inside the integral:

= \lim_{h\rightarrow 0} \int_{y_0}^{y_1} {f(x+h,y)-f(x,y)\over h}\,dy.

The integrand is a difference quotient for f in its first argument: by the mean value theorem it equals {\partial \over \partial x} f(\xi, y) for some \xi between x and x + h, and the uniform continuity of \partial f / \partial x on the compact rectangle [x_0,x_1]\times[y_0,y_1] makes this convergence uniform in y, so the limit may be taken inside the integral:

=  \int_{y_0}^{y_1} {\partial \over \partial x} f(x,y)\,dy

Since this chain of equalities began with {d\over dx}\, u(x), we conclude that

{d\over dx}\, \int_{y_0}^{y_1} f(x, y) \,dy = \int_{y_0}^{y_1} {\partial \over \partial x} f(x,y)\,dy,

as claimed.

Alternate form

For a function g of a single variable, the fundamental theorem of calculus together with the chain rule gives:

{d\over dx}\, \int_{f_1(x)}^{f_2(x)} g(t) \,dt = g(f_2(x)) {f_2'(x)} -  g(f_1(x)) f_1'(x)
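
For example (an illustrative check, not part of the original article), take g(t) = \cos t, f_1(x) = 0 and f_2(x) = x^2. The formula gives

{d\over dx}\, \int_{0}^{x^2} \cos t \,dt = \cos(x^2)\cdot 2x - \cos(0)\cdot 0 = 2x\cos(x^2),

which agrees with computing \int_0^{x^2} \cos t \,dt = \sin(x^2) and differentiating directly.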

More generally, if the integrand and both limits depend on the variable of differentiation:

{d\over dq}\, \int_{a(q)}^{b(q)} g(t,q) \,dt = g(b(q),q) {b'(q)} -  g(a(q),q) a'(q) + \int_{a(q)}^{b(q)} {\partial \over \partial q}  g(t,q) \,dt
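
As a quick sanity check (again an example added for illustration), take g(t, q) = tq, a(q) = 0 and b(q) = q. Direct integration gives \int_0^q tq \,dt = q^3/2, whose derivative is 3q^2/2, and the formula agrees:

{d\over dq}\, \int_{0}^{q} tq \,dt = (q \cdot q)\cdot 1 - 0 + \int_{0}^{q} t \,dt = q^2 + {q^2 \over 2} = {3q^2 \over 2}.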

In statistics, this formula can be used to show that an estimator is complete.
