Skew-symmetric matrix

In linear algebra, a skew-symmetric (or antisymmetric) matrix is a square matrix A whose transpose is also its negative; that is, it satisfies the equation:

A^T = −A,

or in component form, if A = (a_{ij}):

a_{ij} = −a_{ji}   for all i and j.

For example, the following matrix is skew-symmetric:

\begin{bmatrix}
0 & 2 & -1 \\
-2 & 0 & -4 \\
1 & 4 & 0\end{bmatrix}.

Compare this with a symmetric matrix, whose transpose is the same as the matrix itself: A^T = A.
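A minimal numerical sketch of the definition, assuming NumPy and using the example matrix above:

import numpy as np

# The example skew-symmetric matrix from above.
A = np.array([[ 0,  2, -1],
              [-2,  0, -4],
              [ 1,  4,  0]])

# Skew-symmetry: the transpose equals the negative.
print(np.array_equal(A.T, -A))  # True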


Properties

Sums and scalar products of skew-symmetric matrices are again skew-symmetric. Hence, the n×n skew-symmetric matrices form a vector space; its dimension is \tfrac{n\left(n-1\right)}{2}.

If A is skew-symmetric and B is an arbitrary matrix, then the triple product B^T A B is skew-symmetric.

The "skew-symmetric component" of a square matrix A is the matrix B=\tfrac{1}{2}\left(A-A^{T}\right); the "symmetric component" of A is C=\tfrac{1}{2}\left(A+A^{T}\right); the matrix A is the sum of its symmetric and skew-symmetric components.

A real matrix A is skew-symmetric if and only if x^T A x = 0 for all real vectors x.

All main diagonal entries of a skew-symmetric matrix have to be zero, and so the trace is zero.
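Both properties are easy to verify numerically (a sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M - M.T                          # a random real skew-symmetric matrix

x = rng.standard_normal(5)
print(np.isclose(x @ A @ x, 0.0))    # True: the quadratic form vanishes
print(np.allclose(np.diag(A), 0.0))  # True: the main diagonal is zero
print(np.isclose(np.trace(A), 0.0))  # True: hence the trace is zero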

The determinant of a skew-symmetric matrix

Let A be an n×n skew-symmetric matrix. The determinant of A satisfies

det(A) = det(A^T) = det(−A) = (−1)^n det(A).

In particular, if n is odd the determinant vanishes. This result is called Jacobi's theorem, after Carl Gustav Jacobi (Eves, 1980).

The even-dimensional case is more interesting. It turns out that the determinant of A for n even can be written as the square of a polynomial in the entries of A (a theorem of Thomas Muir):

det(A) = Pf(A)^2.

This polynomial is called the Pfaffian of A and is denoted Pf(A). Thus the determinant of a real skew-symmetric matrix is always non-negative.
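For small matrices the Pfaffian can be computed by expansion along the first row; the helper pfaffian below is an illustrative sketch assuming NumPy, not a standard library routine:

import numpy as np

def pfaffian(A):
    # Pfaffian by expansion along the first row; exponential time,
    # intended only for small matrices.
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2 == 1:
        return 0.0                       # odd size: the Pfaffian is zero
    total = 0.0
    for k, j in enumerate(range(1, n)):
        keep = [i for i in range(1, n) if i != j]
        minor = A[np.ix_(keep, keep)]    # delete rows/columns 0 and j
        total += (-1) ** k * A[0, j] * pfaffian(minor)
    return total

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M - M.T
print(np.isclose(np.linalg.det(A), pfaffian(A) ** 2))  # True

# Jacobi's theorem: for odd n the determinant vanishes.
M = rng.standard_normal((5, 5))
A = M - M.T
print(np.isclose(np.linalg.det(A), 0.0))               # True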

Spectral theory

The eigenvalues of a skew-symmetric matrix always come in pairs ±λ (except in the odd-dimensional case, where there is an additional unpaired 0 eigenvalue). For a real skew-symmetric matrix the nonzero eigenvalues are all purely imaginary and thus are of the form iλ_1, −iλ_1, iλ_2, −iλ_2, … where each λ_k is real.

Real skew-symmetric matrices are normal matrices (they commute with their adjoints) and are thus subject to the spectral theorem, which states that any real skew-symmetric matrix can be diagonalized by a unitary matrix. Since the eigenvalues of a real skew-symmetric matrix are purely imaginary, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a block diagonal form by an orthogonal transformation. Specifically, every 2n × 2n real skew-symmetric matrix can be written in the form A = QΣQ^T where Q is orthogonal and

\Sigma = \begin{bmatrix}
\begin{matrix}0 & \lambda_1\\ -\lambda_1 & 0\end{matrix} & 0 & \cdots & 0 & 0\\
0 & \begin{matrix}0 & \lambda_2\\ -\lambda_2 & 0\end{matrix} & \cdots & 0 & 0\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
0 & 0 & \cdots & \begin{matrix}0 & \lambda_r\\ -\lambda_r & 0\end{matrix} & 0\\
0 & 0 & \cdots & 0 & \begin{matrix}0\\ & \ddots\\ & & 0\end{matrix}
\end{bmatrix}

for real λ_k. The nonzero eigenvalues of this matrix are ±iλ_k. In the odd-dimensional case Σ always has at least one row and column of zeros.
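This block-diagonal form can be computed via the real Schur decomposition (a sketch using SciPy; since a real skew-symmetric matrix is normal, its real Schur factor is exactly such a block-diagonal Σ):

import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = M - M.T                       # a random real skew-symmetric matrix

# The eigenvalues are purely imaginary (zero real part, up to rounding).
print(np.allclose(np.linalg.eigvals(A).real, 0.0))  # True

# Real Schur form A = Q Sigma Q^T with Q orthogonal; Sigma consists of
# 2x2 blocks [[0, lam], [-lam, 0]] on the diagonal.
Sigma, Q = schur(A, output='real')
print(np.allclose(Q @ Sigma @ Q.T, A))              # True
print(np.round(Sigma, 3))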

Alternating forms

An alternating form φ on a vector space V over a field K is defined (provided K does not have characteristic 2) to be a bilinear form

φ : V × V → K

such that

φ(v, w) = −φ(w, v).

Such a φ will be represented by a skew-symmetric matrix A, φ(v, w) = v^T A w, once a basis of V is chosen; and conversely an n×n skew-symmetric matrix A on K^n gives rise to an alternating form sending (v, w) to v^T A w.
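A quick numerical illustration of this correspondence over K = R (a sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))
A = M - M.T                         # skew-symmetric matrix of the form

def phi(v, w):
    return v @ A @ w                # phi(v, w) = v^T A w

v = rng.standard_normal(3)
w = rng.standard_normal(3)
print(np.isclose(phi(v, w), -phi(w, v)))  # True: the form alternates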

Infinitesimal rotations

Skew-symmetric matrices form the tangent space to the orthogonal group O(n) at the identity matrix. In a sense, then, skew-symmetric matrices can be thought of as infinitesimal rotations.

Another way of saying this is that the space of skew-symmetric matrices forms the Lie algebra o(n) of the Lie group O(n). The Lie bracket on this space is given by the commutator:

[A, B] = AB − BA.

It is easy to check that the commutator of two skew-symmetric matrices is again skew-symmetric: using A^T = −A and B^T = −B,

(AB − BA)^T = B^T A^T − A^T B^T = BA − AB = −(AB − BA).
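The same check in code (a sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((3, 3)); A = A - A.T   # skew-symmetric
B = rng.standard_normal((3, 3)); B = B - B.T   # skew-symmetric

C = A @ B - B @ A                              # the commutator [A, B]
print(np.allclose(C, -C.T))                    # True: skew-symmetric again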

The matrix exponential of a skew-symmetric matrix A is then an orthogonal matrix R:

R=\exp(A)=\sum_{n=0}^\infty \frac{A^n}{n!}.

The image of the exponential map of a Lie algebra always lies in the connected component of the Lie group that contains the identity element. In the case of the Lie group O(n), this connected component is the special orthogonal group SO(n), consisting of all orthogonal matrices with determinant 1. So R = exp(A) will have determinant +1. It turns out that every orthogonal matrix with unit determinant can be written as the exponential of some skew-symmetric matrix.
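For example, in two dimensions the exponential of a skew-symmetric matrix is a plane rotation (a sketch using SciPy's matrix exponential):

import numpy as np
from scipy.linalg import expm

theta = 0.3
A = np.array([[0.0, -theta],
              [theta, 0.0]])      # skew-symmetric generator

R = expm(A)                       # rotation by the angle theta
print(np.allclose(R, [[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]]))  # True
print(np.allclose(R.T @ R, np.eye(2)))                    # True: R is orthogonal
print(np.isclose(np.linalg.det(R), 1.0))                  # True: det R = +1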
