Jacobi method


The Jacobi method is an iterative algorithm in linear algebra for determining the solutions of a system of linear equations whose matrix is diagonally dominant, that is, in each row and column the entry of largest absolute value lies on the diagonal. Each diagonal element is solved for, and an approximate value is plugged in. The process is then iterated until it converges. This algorithm is a stripped-down version of the Jacobi transformation method of matrix diagonalization. The method is named after the German mathematician Carl Gustav Jacob Jacobi.

We seek the solution to a set of linear equations, written in matrix form as

 Ax = b.

Let A = D + (L + U), where D is the diagonal part of the coefficient matrix A, and L and U are its strictly lower and strictly upper triangular parts, respectively. Then the equation above can be rewritten as:

 Dx + (L + U)x = b.

Moreover,

 x = D^{-1}\left[b - (L + U)x\right],

provided that a_{ii} \neq 0 for each i. Using this relation as an iteration rule, the Jacobi method is defined by:

 
x^{(k+1)} = D^{-1}\left[b - (L + U)x^{(k)}\right],

where k is the iteration count. Often an element-based approach is used:

 
x^{(k+1)}_i = \frac{1}{a_{ii}} \left(b_i - \sum_{j \ne i} a_{ij} x^{(k)}_j\right), \quad i = 1, 2, \ldots, n.

Note that the computation of x^{(k+1)}_i requires every element of x^{(k)} except x^{(k)}_i itself. Unlike in the Gauss–Seidel method, x^{(k)}_i cannot be overwritten with x^{(k+1)}_i, since that value is still needed for the remaining computations. This is the most meaningful difference between the Jacobi and Gauss–Seidel methods. The minimum amount of storage is therefore two vectors of size n, and explicit copying has to take place.
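
In matrix form, one Jacobi sweep is just a diagonal solve applied to b - (L + U)x^{(k)}. The following NumPy sketch illustrates this; the function name jacobi_matrix_form, the tolerance and the iteration cap are illustrative assumptions, not part of the original article:

import numpy as np

def jacobi_matrix_form(A, b, x0=None, tol=1e-10, max_iter=500):
    # Sketch of the iteration x^(k+1) = D^-1 (b - (L + U) x^(k)).
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    D = np.diag(A)                   # diagonal entries of A
    R = A - np.diagflat(D)           # off-diagonal remainder L + U
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D      # elementwise division by the diagonal
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new                    # old and new vectors are kept separate
    return x

The update builds x_new entirely from the old vector x, which is exactly why two vectors of storage are needed.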

Algorithm

Choose an initial guess x^{(0)} to the solution.

for k := 1 step 1 until convergence do
    for i := 1 step 1 until n do
        σ = 0
        for j := 1 step 1 until n do
            if j ≠ i then
                σ = σ + a_{ij} x_j^{(k-1)}
            end if
        end (j-loop)
        x_i^{(k)} = (b_i - σ) / a_{ii}
    end (i-loop)
    check if convergence is reached
end (k-loop)
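
The pseudocode above translates almost line for line into Python; the following element-based sketch is only illustrative (the function name jacobi_elementwise, the tolerance and the maximum iteration count are assumptions, not part of the article):

def jacobi_elementwise(A, b, x0, tol=1e-10, max_iter=500):
    # A: list of rows, b: right-hand side, x0: initial guess (lists of numbers)
    n = len(b)
    x_old = list(x0)
    for _ in range(max_iter):                     # k-loop
        x_new = [0.0] * n                         # fresh vector: x_old is not overwritten
        for i in range(n):                        # i-loop
            sigma = 0.0
            for j in range(n):                    # j-loop
                if j != i:
                    sigma += A[i][j] * x_old[j]
            x_new[i] = (b[i] - sigma) / A[i][i]
        # convergence check: infinity norm of the update
        if max(abs(x_new[i] - x_old[i]) for i in range(n)) < tol:
            return x_new
        x_old = x_new
    return x_old

As a quick check, for A = [[2, 1], [5, 7]], b = [11, 13] and initial guess [1, 1] (a strictly diagonally dominant system), the iteration converges to approximately (7.111, -3.222), i.e. the exact solution (64/9, -29/9).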

Convergence

The method always converges if the matrix A is strictly or irreducibly diagonally dominant. Strict row diagonal dominance means that, for each row, the absolute value of the diagonal term is greater than the sum of the absolute values of the other terms:

\left| a_{ii} \right| > \sum_{j \ne i} \left| a_{ij} \right|.

The Jacobi method sometimes converges even if this condition is not satisfied. A necessary and sufficient condition for convergence from any starting vector is that the spectral radius of the iteration matrix D^{-1}(L + U) is strictly less than 1.
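
As a sketch of how these conditions can be checked numerically (using the spectral-radius criterion stated above; the function name jacobi_convergence_check is an illustrative assumption):

import numpy as np

def jacobi_convergence_check(A):
    # Returns (strictly row diagonally dominant?, spectral radius of D^-1 (L + U)).
    A = np.asarray(A, dtype=float)
    D = np.diag(A)                                   # diagonal entries
    R = A - np.diagflat(D)                           # off-diagonal part L + U
    dominant = np.all(np.abs(D) > np.sum(np.abs(R), axis=1))
    rho = np.max(np.abs(np.linalg.eigvals(R / D[:, None])))
    return dominant, rho

The method converges for every starting vector exactly when the returned spectral radius is below 1; strict diagonal dominance is a convenient sufficient condition.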
