Dirichlet's test


In mathematics, Dirichlet's test is a method of testing for the convergence of a series. It is named after its author, Peter Gustav Lejeune Dirichlet.

Given two sequences of real numbers, {a_n} and {b_n}, if the sequences satisfy

  • a_n \geq a_{n+1} > 0
  • \lim_{n \rightarrow \infty} a_n = 0
  • \left|\sum^{N}_{n=1}b_n\right|\leq M for every positive integer N

where M is some constant, then the series

\sum^{\infty}_{n=1}a_n b_n

converges.
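
The standard proof (only sketched here, not given in this article) rests on Abel's summation by parts. Writing B_n = \sum^{n}_{k=1}b_k for the partial sums, one has

\sum^{N}_{n=1}a_n b_n = a_N B_N + \sum^{N-1}_{n=1}(a_n - a_{n+1})B_n.

Since \left|B_n\right| \leq M and a_n \rightarrow 0, the first term tends to zero, while the second sum converges absolutely because \sum^{\infty}_{n=1}(a_n - a_{n+1}) telescopes to a_1.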

A corollary to Dirichlet's test is the more commonly used alternating series test for the case

b_n = (-1)^n \implies \left|\sum^{N}_{n=1}b_n\right| \leq 1.
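
For example (an illustration not given in this article), taking a_n = 1/n and b_n = (-1)^n satisfies the hypotheses with M = 1, and Dirichlet's test then yields the convergence of the alternating harmonic series

\sum^{\infty}_{n=1}\frac{(-1)^n}{n} = -\ln 2.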

Another corollary is that \sum_{n=1}^\infty a_n \sin n converges whenever {a_n} is a decreasing sequence that tends to zero.
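
A brief supporting computation (standard, though not spelled out in this article): the partial sums of \sin n satisfy the closed form

\left|\sum^{N}_{n=1}\sin n\right| = \frac{\left|\sin\frac{N}{2}\right|\left|\sin\frac{N+1}{2}\right|}{\sin\frac{1}{2}} \leq \frac{1}{\sin\frac{1}{2}},

so the boundedness condition of the test holds with M = 1/\sin\frac{1}{2}.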

References

  • Hardy, G. H., A Course of Pure Mathematics, Ninth edition, Cambridge University Press, 1946. (pp. 379-380).
  • Voxman, William L., Advanced Calculus: An Introduction to Modern Analysis, Marcel Dekker, Inc., New York, 1981. (§8.B.13-15) ISBN 0-8247-6949-X.
