Dirichlet's test
In mathematics, Dirichlet's test is a method of testing for the convergence of a series and is named after the mathematician Peter Gustav Lejeune Dirichlet.
Given two sequences of real numbers, $\{a_n\}$ and $\{b_n\}$, if the sequences satisfy

- $a_{n+1} \le a_n$ for every positive integer $n$, and $\lim_{n \to \infty} a_n = 0$
- $\left| \sum_{n=1}^{N} b_n \right| \le M$ for every positive integer $N$

where $M$ is some constant, then the series

$$\sum_{n=1}^{\infty} a_n b_n$$

converges.
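A brief sketch of the standard proof, via summation by parts (this argument is not spelled out in the original text, but is the classical one): writing $B_N = \sum_{n=1}^{N} b_n$, one has

$$\sum_{n=1}^{N} a_n b_n = a_N B_N + \sum_{n=1}^{N-1} (a_n - a_{n+1}) B_n.$$

Since $|B_N| \le M$ and $a_N \to 0$, the boundary term $a_N B_N$ tends to zero; and because $a_n - a_{n+1} \ge 0$, the remaining sum converges absolutely by comparison, since $\sum_{n=1}^{\infty} (a_n - a_{n+1}) = a_1$ by telescoping. Hence $\sum a_n b_n$ converges.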
A corollary to Dirichlet's test is the more commonly used alternating series test for the case

$$b_n = (-1)^n.$$
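To see why this case follows (a one-line verification, added here for clarity): with $b_n = (-1)^n$ the partial sums alternate between $-1$ and $0$, so

$$\left| \sum_{n=1}^{N} (-1)^n \right| \le 1 \quad \text{for every positive integer } N,$$

and Dirichlet's test applies with $M = 1$, recovering the alternating series test.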
Another corollary is that $\sum_{n=1}^{\infty} a_n \sin n$ converges whenever $\{a_n\}$ is a decreasing sequence that tends to zero.
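This corollary rests on the classical bound for the partial sums of $\sin n$ (a standard identity, supplied here as supporting detail):

$$\sum_{n=1}^{N} \sin n = \frac{\sin\frac{N}{2}\,\sin\frac{N+1}{2}}{\sin\frac{1}{2}}, \qquad \text{so} \qquad \left| \sum_{n=1}^{N} \sin n \right| \le \frac{1}{\sin\frac{1}{2}},$$

and Dirichlet's test applies with $b_n = \sin n$ and $M = 1/\sin\frac{1}{2}$. In particular, $\sum_{n=1}^{\infty} \frac{\sin n}{n}$ converges.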