Inverse eigenvalues theorem

In numerical analysis and linear algebra, the inverse eigenvalues theorem states that if A is a nonsingular matrix, then λ is an eigenvalue of A if and only if λ⁻¹ is an eigenvalue of A⁻¹. Because A is nonsingular, every eigenvalue λ of A satisfies |λ| > 0, so λ⁻¹ is well defined.
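
The equivalence follows from the eigenvalue equation itself; a minimal derivation, sketched below in LaTeX for illustration, multiplies by A⁻¹ and divides by λ, both of which are valid precisely because A is nonsingular and λ is nonzero.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $A$ be nonsingular and $Av = \lambda v$ with $v \neq 0$. Then $\lambda \neq 0$, and
\begin{align*}
  Av = \lambda v
  &\;\Longrightarrow\; v = \lambda A^{-1} v
      && \text{(multiply both sides by $A^{-1}$)} \\
  &\;\Longrightarrow\; A^{-1} v = \lambda^{-1} v
      && \text{(divide by $\lambda \neq 0$)},
\end{align*}
so $\lambda^{-1}$ is an eigenvalue of $A^{-1}$ with the same eigenvector $v$.
Applying the same argument to $A^{-1}$, whose inverse is $A$, gives the converse direction.
\end{document}
```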