Rate of convergence
In numerical analysis (a branch of mathematics), the speed at which a convergent sequence approaches its limit is called the rate of convergence. Although strictly speaking a limit does not give information about any finite initial segment of the sequence, the concept is of practical importance when dealing with a sequence of successive approximations produced by an iterative method: the higher the rate of convergence, the fewer iterations are typically needed to reach a useful approximation. This may even make the difference between needing ten iterations or a million.
The definition of rate of convergence
Suppose that the sequence {x_k} converges to the number ξ.
We say that this sequence converges linearly to ξ if

    \lim_{k \to \infty} \frac{|x_{k+1} - \xi|}{|x_k - \xi|} = \mu \qquad (1)

for some number μ with 0 < μ < 1.
The number μ is called the rate of convergence.
If (1) holds with μ = 0, then the sequence is said to converge superlinearly. One says that the sequence converges sublinearly if it converges, but (1) does not hold for any μ < 1.
The next definition is used to distinguish superlinear rates of convergence. We say that the sequence converges with order q, for q > 1, to ξ if

    \lim_{k \to \infty} \frac{|x_{k+1} - \xi|}{|x_k - \xi|^q} = \mu \qquad (2)

for some number μ > 0.
In particular, convergence with order 2 is called quadratic convergence, and convergence with order 3 is called cubic convergence.
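In practice, the order q can be estimated from three successive errors, since q ≈ log(e_{k+1}/e_k) / log(e_k/e_{k−1}) when the errors satisfy e_{k+1} ≈ μ e_k^q. The following short Python sketch illustrates this estimate (the helper name estimate_order is purely illustrative):

    import math

    def estimate_order(errors):
        """Estimate the order of convergence q from successive absolute
        errors e_k = |x_k - xi|, using
            q ~ log(e_{k+1} / e_k) / log(e_k / e_{k-1}).
        Returns one estimate per consecutive triple of errors."""
        estimates = []
        for e_prev, e_curr, e_next in zip(errors, errors[1:], errors[2:]):
            estimates.append(math.log(e_next / e_curr) / math.log(e_curr / e_prev))
        return estimates

    # Errors of a quadratically convergent sequence, e.g. e_k = 2**-(2**k):
    errors = [2.0 ** -(2 ** k) for k in range(1, 6)]
    print(estimate_order(errors))  # each estimate is (very close to) 2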
The extended definition of rate of convergence
The drawback of the above definitions (1) and (2) is that they do not capture some sequences that still converge reasonably fast but whose "speed" is variable, such as the sequence {b_k} below. Therefore, the definition of rate of convergence is sometimes extended as follows.
Under the extended definition, the sequence {x_k} converges with at least order q if there exists a sequence {ε_k} such that

    |x_k - \xi| \le \varepsilon_k \quad \text{for all } k,

and the sequence {ε_k} converges to zero with order q according to the above "simple" definition.
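As a small numerical illustration (a sketch, with the bounding sequence ε_k = 2 · 2^{-k} chosen by hand), one can check the envelope condition for the sequence {b_k} used in the examples below:

    # b_k = 4**-(k//2): its error ratios oscillate between 1 and 1/4, so the
    # limit in (1) does not exist, but b_k <= eps_k = 2 * 2**-k, and {eps_k}
    # converges linearly to 0 with rate 1/2.
    b   = [4.0 ** -(k // 2) for k in range(15)]
    eps = [2.0 * 2.0 ** -k for k in range(15)]

    assert all(bk <= ek for bk, ek in zip(b, eps))
    print("{b_k} is bounded by a sequence converging linearly with rate 1/2")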
Acceleration of convergence
Several methods exist to accelerate the convergence of a given sequence, i.e. to compute a new sequence that converges faster to the same limit; this may be much less "expensive" than computing more terms of the original sequence. See for example Aitken's delta-squared process.
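For a linearly convergent sequence, Aitken's process replaces x_k by x_k − (x_{k+1} − x_k)² / (x_{k+2} − 2x_{k+1} + x_k). A minimal Python sketch (the function name aitken is illustrative):

    def aitken(x):
        """Apply Aitken's delta-squared process to the terms x[0], x[1], ...
        Returns the accelerated sequence, which is two terms shorter."""
        accelerated = []
        for x0, x1, x2 in zip(x, x[1:], x[2:]):
            denom = x2 - 2 * x1 + x0
            if denom == 0:  # the formula is undefined in this case
                break
            accelerated.append(x0 - (x1 - x0) ** 2 / denom)
        return accelerated

    # A sequence converging linearly to 0 with rate 1/2:
    x = [2.0 ** -k + 3.0 ** -k for k in range(10)]
    print(aitken(x))  # the accelerated terms approach 0 faster than the original terms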
Examples
Consider the following sequences:

    a_k = 2^{-k}:      1, 1/2, 1/4, 1/8, 1/16, ...
    b_k = 4^{-⌊k/2⌋}:  1, 1, 1/4, 1/4, 1/16, 1/16, ...
    c_k = 2^{-2^k}:    1/2, 1/4, 1/16, 1/256, ...
    d_k = 1/(k+1):     1, 1/2, 1/3, 1/4, 1/5, ...

The sequence {a_k} converges linearly to 0 with rate 1/2. More generally, the sequence {Cμ^k} converges linearly to 0 with rate |μ| if 0 < |μ| < 1. The sequence {b_k} also converges linearly to 0 with rate 1/2 under the extended definition, but not under the simple definition, because the ratio of successive terms oscillates between 1 and 1/4. The sequence {c_k} converges superlinearly; in fact, it is quadratically convergent. Finally, the sequence {d_k} converges sublinearly.
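The stated rates can be observed numerically by printing the ratios of successive terms (since the limit is 0, the terms themselves are the errors); a small Python sketch:

    sequences = {
        "a_k = 2**-k":       [2.0 ** -k for k in range(1, 10)],
        "b_k = 4**-(k//2)":  [4.0 ** -(k // 2) for k in range(1, 10)],
        "c_k = 2**-(2**k)":  [2.0 ** -(2 ** k) for k in range(1, 6)],
        "d_k = 1/(k+1)":     [1.0 / (k + 1) for k in range(1, 10)],
    }

    for name, terms in sequences.items():
        ratios = [nxt / cur for cur, nxt in zip(terms, terms[1:])]
        print(name, [round(r, 4) for r in ratios])
    # a_k: every ratio is 0.5 (linear, rate 1/2)
    # b_k: ratios oscillate between 1.0 and 0.25 (no limit; extended definition applies)
    # c_k: ratios shrink toward 0 (superlinear)
    # d_k: ratios increase toward 1 (sublinear)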
References
The simple definition is used in
- Michelle Schatzman (2002), Numerical analysis: a mathematical introduction, Clarendon Press, Oxford. ISBN 0-19-850279-6.
The extended definition is used in
- Kendall A. Atkinson (1988), An introduction to numerical analysis (2nd ed.), John Wiley and Sons. ISBN 0-471-50023-2.
- Walter Gautschi (1997), Numerical analysis: an introduction, Birkhäuser, Boston.
- Endre Süli and David Mayers (2003), An introduction to numerical analysis, Cambridge University Press. ISBN 0-521-00794-1.