Eigenvalue perturbation
From Wikipedia, the free encyclopedia
Eigenvalue perturbation is a perturbation approach to finding the eigenvalues and eigenvectors of a system that has been perturbed from one with known eigenvalues and eigenvectors. It also allows one to determine the sensitivity of the eigenvalues and eigenvectors with respect to changes in the system.
Example
Suppose we have solutions to the generalized eigenvalue problem,

- $[K_0]\mathbf{x}_{0i} = \lambda_{0i}[M_0]\mathbf{x}_{0i}, \qquad i = 1, \dots, N. \qquad (1)$

That is, we know the eigenvalues $\lambda_{0i}$ and eigenvectors $\mathbf{x}_{0i}$ for $i = 1, \dots, N$. Now suppose we want to change the matrices by a small amount. That is, we want to let

- $[K] = [K_0] + [\delta K]$

and

- $[M] = [M_0] + [\delta M],$

where all of the $\delta$ terms are much smaller than the corresponding unperturbed terms. We expect the answers to be of the form

- $\lambda_i = \lambda_{0i} + \delta\lambda_i$

and

- $\mathbf{x}_i = \mathbf{x}_{0i} + \delta\mathbf{x}_i.$
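As a concrete illustration of this setup, here is a minimal Python sketch (using NumPy and SciPy; the matrix size, the helper random_spd and the perturbation size are made-up toy choices, not anything from the article) that solves the unperturbed generalized eigenvalue problem and forms a slightly perturbed pair.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 5

def random_spd(scale=1.0):
    """Random symmetric positive-definite matrix (illustrative toy data)."""
    A = rng.standard_normal((n, n))
    return scale * (A @ A.T + n * np.eye(n))

# Unperturbed symmetric positive-definite matrices [K0] and [M0].
K0 = random_spd()
M0 = random_spd()

# Solve [K0] x0i = lam0i [M0] x0i.  For a symmetric-definite pair, eigh
# returns eigenvectors scaled so that X0.T @ M0 @ X0 = I.
lam0, X0 = eigh(K0, M0)

# Small symmetric perturbations and the perturbed matrices.
dK = 1e-5 * random_spd()
dM = 1e-5 * random_spd()
K = K0 + dK
M = M0 + dM

print("mass-normalization holds:", np.allclose(X0.T @ M0 @ X0, np.eye(n)))
```

The scaling checked in the last line is exactly the normalization assumed at the start of the Steps section below.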
Steps
We assume that the matrices are symmetric and positive definite, and that we have scaled the eigenvectors such that

- $\mathbf{x}^T_{0j}[M_0]\mathbf{x}_{0i} = \delta_{ij}, \qquad (2)$

where $\delta_{ij}$ is the Kronecker delta. Define $\omega_i$ by $\omega_i^2 = \lambda_{0i}$.
Now we want to solve the equation

- $[K]\mathbf{x}_i = \lambda_i[M]\mathbf{x}_i.$
Substituting, we get

- $([K_0] + [\delta K])(\mathbf{x}_{0i} + \delta\mathbf{x}_i) = (\lambda_{0i} + \delta\lambda_i)([M_0] + [\delta M])(\mathbf{x}_{0i} + \delta\mathbf{x}_i),$
which expands to

- $[K_0]\mathbf{x}_{0i} + [\delta K]\mathbf{x}_{0i} + [K_0]\delta\mathbf{x}_i + [\delta K]\delta\mathbf{x}_i = \lambda_{0i}[M_0]\mathbf{x}_{0i} + \lambda_{0i}[\delta M]\mathbf{x}_{0i} + \lambda_{0i}[M_0]\delta\mathbf{x}_i + \delta\lambda_i[M_0]\mathbf{x}_{0i} + \lambda_{0i}[\delta M]\delta\mathbf{x}_i + \delta\lambda_i[\delta M]\mathbf{x}_{0i} + \delta\lambda_i[M_0]\delta\mathbf{x}_i + \delta\lambda_i[\delta M]\delta\mathbf{x}_i.$
Canceling $[K_0]\mathbf{x}_{0i} = \lambda_{0i}[M_0]\mathbf{x}_{0i}$, which holds by (1), leaves

- $[\delta K]\mathbf{x}_{0i} + [K_0]\delta\mathbf{x}_i + [\delta K]\delta\mathbf{x}_i = \lambda_{0i}[\delta M]\mathbf{x}_{0i} + \lambda_{0i}[M_0]\delta\mathbf{x}_i + \delta\lambda_i[M_0]\mathbf{x}_{0i} + \lambda_{0i}[\delta M]\delta\mathbf{x}_i + \delta\lambda_i[\delta M]\mathbf{x}_{0i} + \delta\lambda_i[M_0]\delta\mathbf{x}_i + \delta\lambda_i[\delta M]\delta\mathbf{x}_i.$
Removing the higher-order terms (those containing products of two or more of the small quantities $[\delta K]$, $[\delta M]$, $\delta\lambda_i$ and $\delta\mathbf{x}_i$), this simplifies to

- $[K_0]\delta\mathbf{x}_i + [\delta K]\mathbf{x}_{0i} = \lambda_{0i}[M_0]\delta\mathbf{x}_i + \lambda_{0i}[\delta M]\mathbf{x}_{0i} + \delta\lambda_i[M_0]\mathbf{x}_{0i}. \qquad (3)$
We note that, when the matrices are symmetric, the unperturbed eigenvectors are $[M_0]$-orthogonal, and so we use them as a basis for the perturbed eigenvectors. That is, we want to construct

- $\delta\mathbf{x}_i = \sum_{j=1}^{N}\epsilon_{ij}\mathbf{x}_{0j}, \qquad (4)$
where the $\epsilon_{ij}$ are small constants that are to be determined. Substituting (4) into (3) and rearranging gives

- $[K_0]\sum_{j=1}^{N}\epsilon_{ij}\mathbf{x}_{0j} + [\delta K]\mathbf{x}_{0i} = \lambda_{0i}[M_0]\sum_{j=1}^{N}\epsilon_{ij}\mathbf{x}_{0j} + \lambda_{0i}[\delta M]\mathbf{x}_{0i} + \delta\lambda_i[M_0]\mathbf{x}_{0i}. \qquad (5)$
Because the eigenvectors are $[M_0]$-orthogonal (and hence also $[K_0]$-orthogonal), we can remove the summations by left-multiplying by $\mathbf{x}^T_{0i}$:

- $\epsilon_{ii}\mathbf{x}^T_{0i}[K_0]\mathbf{x}_{0i} + \mathbf{x}^T_{0i}[\delta K]\mathbf{x}_{0i} = \lambda_{0i}\epsilon_{ii}\mathbf{x}^T_{0i}[M_0]\mathbf{x}_{0i} + \lambda_{0i}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i} + \delta\lambda_i\mathbf{x}^T_{0i}[M_0]\mathbf{x}_{0i}. \qquad (6)$
The two terms containing $\epsilon_{ii}$ are equal because left-multiplying (1) by $\mathbf{x}^T_{0i}$ gives

- $\mathbf{x}^T_{0i}[K_0]\mathbf{x}_{0i} = \lambda_{0i}\mathbf{x}^T_{0i}[M_0]\mathbf{x}_{0i}.$
Canceling those terms in (6) leaves

- $\mathbf{x}^T_{0i}[\delta K]\mathbf{x}_{0i} = \lambda_{0i}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i} + \delta\lambda_i\mathbf{x}^T_{0i}[M_0]\mathbf{x}_{0i}.$
Rearranging gives

- $\delta\lambda_i = \frac{\mathbf{x}^T_{0i}[\delta K]\mathbf{x}_{0i} - \lambda_{0i}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i}}{\mathbf{x}^T_{0i}[M_0]\mathbf{x}_{0i}}.$
But by (2), this denominator is equal to 1. Thus
- $\delta\lambda_i = \mathbf{x}^T_{0i}[\delta K]\mathbf{x}_{0i} - \lambda_{0i}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i}.$
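As a rough numerical sanity check of this first-order formula (a sketch only, reusing the same kind of made-up toy matrices as in the earlier example), one can compare $\delta\lambda_i$ predicted by the formula with the exact change obtained by re-solving the perturbed problem:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 5

def random_spd(scale=1.0):
    A = rng.standard_normal((n, n))
    return scale * (A @ A.T + n * np.eye(n))

K0, M0 = random_spd(), random_spd()                 # unperturbed matrices
dK, dM = 1e-5 * random_spd(), 1e-5 * random_spd()   # small symmetric perturbations

lam0, X0 = eigh(K0, M0)                 # X0.T @ M0 @ X0 = I
lam_exact, _ = eigh(K0 + dK, M0 + dM)   # exact perturbed eigenvalues

# First-order prediction: dlam_i = x0i^T dK x0i - lam0i * x0i^T dM x0i
dlam = np.array([X0[:, i] @ dK @ X0[:, i] - lam0[i] * (X0[:, i] @ dM @ X0[:, i])
                 for i in range(n)])

print("exact change      :", lam_exact - lam0)
print("first-order change:", dlam)
print("max discrepancy   :", np.max(np.abs(lam_exact - lam0 - dlam)))
```

The discrepancy should be of second order in the perturbation, i.e. much smaller than the changes themselves.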
To determine the remaining coefficients, left-multiply (5) by $\mathbf{x}^T_{0j}$ with $j \neq i$ and proceed in the same way; assuming the unperturbed eigenvalues are distinct, this gives

- $\epsilon_{ij} = \frac{\mathbf{x}^T_{0j}[\delta K]\mathbf{x}_{0i} - \lambda_{0i}\mathbf{x}^T_{0j}[\delta M]\mathbf{x}_{0i}}{\lambda_{0i} - \lambda_{0j}}, \qquad j \neq i.$
To find $\epsilon_{ii}$, use the normalization of the perturbed eigenvector, $\mathbf{x}^T_i[M]\mathbf{x}_i = 1$; keeping only first-order terms gives

- $\epsilon_{ii} = -\tfrac{1}{2}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i}.$
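The coefficients $\epsilon_{ij}$ and $\epsilon_{ii}$ can be checked in the same way. The sketch below (again with made-up toy matrices; the index choice i = 2 is arbitrary) assembles the first-order eigenvector correction and compares it with an exact eigenvector of the perturbed problem:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 5

def random_spd(scale=1.0):
    A = rng.standard_normal((n, n))
    return scale * (A @ A.T + n * np.eye(n))

K0, M0 = random_spd(), random_spd()
dK, dM = 1e-5 * random_spd(), 1e-5 * random_spd()

lam0, X0 = eigh(K0, M0)           # mass-normalized unperturbed solution
lam, X = eigh(K0 + dK, M0 + dM)   # exact perturbed solution (also normalized)

i = 2  # which eigenpair to check

# dx_i = sum_j eps_ij x0j with
#   eps_ij = (x0j^T dK x0i - lam0i x0j^T dM x0i) / (lam0i - lam0j),  j != i
#   eps_ii = -1/2 x0i^T dM x0i
dx = np.zeros(n)
for j in range(n):
    if j == i:
        eps = -0.5 * (X0[:, i] @ dM @ X0[:, i])
    else:
        eps = (X0[:, j] @ dK @ X0[:, i]
               - lam0[i] * (X0[:, j] @ dM @ X0[:, i])) / (lam0[i] - lam0[j])
    dx += eps * X0[:, j]

x_pred = X0[:, i] + dx
x_exact = X[:, i] * np.sign(X[:, i] @ M0 @ X0[:, i])  # fix the arbitrary sign

print("max eigenvector discrepancy:", np.max(np.abs(x_pred - x_exact)))
```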
Summary
To first order, the perturbed eigenvalues and eigenvectors are

- $\lambda_i = \lambda_{0i} + \mathbf{x}^T_{0i}[\delta K]\mathbf{x}_{0i} - \lambda_{0i}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i}$

and

- $\mathbf{x}_i = \mathbf{x}_{0i}\left(1 - \tfrac{1}{2}\mathbf{x}^T_{0i}[\delta M]\mathbf{x}_{0i}\right) + \sum_{j=1,\, j\neq i}^{N}\frac{\mathbf{x}^T_{0j}[\delta K]\mathbf{x}_{0i} - \lambda_{0i}\mathbf{x}^T_{0j}[\delta M]\mathbf{x}_{0i}}{\lambda_{0i} - \lambda_{0j}}\,\mathbf{x}_{0j}.$
Results
This means that it is possible to efficiently do a sensitivity analysis on $\lambda_i$ as a function of changes in the entries of the matrices. (Recall that the matrices are symmetric, so changing $K_{(k\ell)}$ will also change $K_{(\ell k)}$, hence the $(2 - \delta_{k\ell})$ factor in the formulas below.) Writing $\mathbf{x}_{0i(k)}$ for the $k$th entry of $\mathbf{x}_{0i}$,

- $\frac{\partial\lambda_i}{\partial K_{(k\ell)}} = \mathbf{x}_{0i(k)}\,\mathbf{x}_{0i(\ell)}\left(2 - \delta_{k\ell}\right)$

and

- $\frac{\partial\lambda_i}{\partial M_{(k\ell)}} = -\lambda_{0i}\,\mathbf{x}_{0i(k)}\,\mathbf{x}_{0i(\ell)}\left(2 - \delta_{k\ell}\right).$

Similarly, for the eigenvectors,

- $\frac{\partial\mathbf{x}_i}{\partial K_{(k\ell)}} = \sum_{j=1,\, j\neq i}^{N}\frac{\left(\mathbf{x}_{0j(k)}\mathbf{x}_{0i(\ell)} + \mathbf{x}_{0j(\ell)}\mathbf{x}_{0i(k)}\right)\left(1 - \tfrac{1}{2}\delta_{k\ell}\right)}{\lambda_{0i} - \lambda_{0j}}\,\mathbf{x}_{0j}$

and

- $\frac{\partial\mathbf{x}_i}{\partial M_{(k\ell)}} = -\tfrac{1}{2}\,\mathbf{x}_{0i(k)}\,\mathbf{x}_{0i(\ell)}\left(2 - \delta_{k\ell}\right)\mathbf{x}_{0i} - \sum_{j=1,\, j\neq i}^{N}\frac{\lambda_{0i}\left(\mathbf{x}_{0j(k)}\mathbf{x}_{0i(\ell)} + \mathbf{x}_{0j(\ell)}\mathbf{x}_{0i(k)}\right)\left(1 - \tfrac{1}{2}\delta_{k\ell}\right)}{\lambda_{0i} - \lambda_{0j}}\,\mathbf{x}_{0j}.$
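As a quick check of the eigenvalue sensitivities above, the following sketch (toy matrices again; the finite-difference comparison and the index choices are my own illustration, not from the article) perturbs a single symmetric entry pair $(k,\ell)$ of $[K_0]$ and compares the resulting eigenvalue change with $\mathbf{x}_{0i(k)}\mathbf{x}_{0i(\ell)}(2-\delta_{k\ell})$:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 5

def random_spd(scale=1.0):
    A = rng.standard_normal((n, n))
    return scale * (A @ A.T + n * np.eye(n))

K0, M0 = random_spd(), random_spd()
lam0, X0 = eigh(K0, M0)   # mass-normalized unperturbed solution

i, k, ell = 1, 0, 3       # eigenvalue index and matrix entry (arbitrary choices)
h = 1e-6                  # finite-difference step

# Perturb K(k,ell) and, because [K] stays symmetric, K(ell,k) as well.
dK = np.zeros((n, n))
dK[k, ell] = h
dK[ell, k] = h            # same entry when k == ell, so the diagonal case works too

lam_plus, _ = eigh(K0 + dK, M0)
fd = (lam_plus[i] - lam0[i]) / h                                # finite difference

analytic = X0[k, i] * X0[ell, i] * (2.0 - (1.0 if k == ell else 0.0))

print("finite-difference derivative:", fd)
print("analytic sensitivity        :", analytic)
```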