Leibniz formula for determinants

In algebra, the Leibniz formula expresses the determinant of a square matrix in terms of permutations of the matrix entries. Named in honor of Gottfried Leibniz, the formula is

\det(A) = \sum_{\sigma \in S_n} \sgn(\sigma) \prod_{i = 1}^n a_{\sigma(i)}^i

for an n × n matrix A = (a_i^j), where the superscript indexes the column and the subscript the row, and where sgn is the sign function of permutations in the symmetric group S_n, which returns +1 and −1 for even and odd permutations, respectively.
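
For example, for n = 2 the sum has just two terms, one for the identity permutation (sign +1) and one for the transposition exchanging 1 and 2 (sign −1), so the formula reduces to the familiar expression

\det\begin{pmatrix} a_1^1 & a_1^2 \\ a_2^1 & a_2^2 \end{pmatrix} = a_1^1 a_2^2 - a_2^1 a_1^2.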

The formula is also commonly written in terms of the Levi-Civita symbol, using the Einstein summation convention, in which case it becomes

\det(A)=\epsilon^{i_1\cdots i_n}{A^{1}}_{i_1}\cdots {A^{n}}_{i_n},

which may be more familiar to physicists.
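
Since the Levi-Civita symbol vanishes whenever two of its indices coincide, this sum effectively runs over permutations only, exactly as in the first form of the formula. As an illustration, the following is a minimal Python sketch (the helper names sign and det_leibniz are chosen here purely for illustration) that evaluates the Leibniz sum directly; it enumerates all n! permutations and is therefore practical only for small matrices.

from itertools import permutations
from math import prod

def sign(perm):
    # Sign of a permutation given as a tuple of 0-based indices:
    # +1 if the number of inversions is even, -1 if it is odd.
    inversions = sum(1
                     for i in range(len(perm))
                     for j in range(i + 1, len(perm))
                     if perm[i] > perm[j])
    return 1 if inversions % 2 == 0 else -1

def det_leibniz(A):
    # Leibniz formula: sum over all permutations sigma of
    # sgn(sigma) times the product over columns i of the entry
    # in row sigma(i), column i.
    n = len(A)
    return sum(sign(sigma) * prod(A[sigma[i]][i] for i in range(n))
               for sigma in permutations(range(n)))

A = [[1, 2],
     [3, 4]]
print(det_leibniz(A))  # 1*4 - 2*3 = -2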

In what follows, a proof is given that this formula agrees with the characterization of the determinant as the unique function of the columns of a matrix that is multilinear and alternating and takes the value 1 on the identity matrix.

Theorem. There exists exactly one function

 F : \mathfrak M_n (\mathbb K) \longrightarrow \mathbb K

which is alternating and multilinear with respect to the columns and such that F(I) = 1.
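
Here multilinearity means that F is linear in each column separately, and the alternating property means that F vanishes whenever two columns coincide; as a consequence, exchanging two columns changes the sign of F. Explicitly, writing F\left(v^1, \dots, v^n\right) for the value of F on the matrix with columns v^1, \dots, v^n, these two properties read

F\left(v^1, \dots, \lambda w + \mu w', \dots, v^n\right) = \lambda F\left(v^1, \dots, w, \dots, v^n\right) + \mu F\left(v^1, \dots, w', \dots, v^n\right)

for all scalars \lambda, \mu \in \mathbb K, and

F\left(v^1, \dots, v^n\right) = 0 \quad \text{whenever } v^j = v^k \text{ for some } j \neq k.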

Proof.

Let F be such a function, and let A = (a_i^j)_{i = 1, \dots, n}^{j = 1, \dots , n} be an n \times n matrix. Call A^j the j-th column of A, i.e. A^j = (a_i^j)_{i = 1, \dots , n}, so that A = \left(A^1, \dots, A^n\right).

Also, let E^k denote the k-th column vector of the identity matrix.

Now one writes each of the A^j in terms of the E^k, i.e.

A^j = \sum_{k = 1}^n a_k^j E^k.

As F is multilinear, one has


\begin{align}
F(A)& = F\left(\sum_{k_1 = 1}^n a_{k_1}^1 E^{k_1}, A^2, \dots, A^n\right)\\
& = \sum_{k_1 = 1}^n a_{k_1}^1 F\left(E^{k_1}, A^2, \dots, A^n\right)\\
& = \cdots\\
& = \sum_{k_1, \dots, k_n = 1}^n \prod_{i = 1}^n a_{k_i}^i F\left(E^{k_1}, \dots, E^{k_n}\right).
\end{align}
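
For instance, for n = 2 this expansion has the four terms

F(A) = a_1^1 a_1^2 F\left(E^{1}, E^{1}\right) + a_1^1 a_2^2 F\left(E^{1}, E^{2}\right) + a_2^1 a_1^2 F\left(E^{2}, E^{1}\right) + a_2^1 a_2^2 F\left(E^{2}, E^{2}\right).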

In the above sum, every term in which two of the indices k_i coincide vanishes: two columns of \left(E^{k_1}, \dots, E^{k_n}\right) are then equal, and F is alternating. The surviving terms are exactly those for which \left(k_1, \dots , k_n\right) is a permutation of \left(1, \dots, n\right), so the sum can be rewritten as

\sum_{\sigma \in \mathfrak S_n} \prod_{i = 1}^n a_{\sigma(i)}^i F\left(E^{\sigma(1)}, \dots , E^{\sigma(n)}\right).

Now one rearranges the columns of \left(E^{\sigma(1)}, \dots, E^{\sigma(n)}\right) until it becomes the identity matrix. Since F is alternating, each exchange of two columns multiplies the value of F by −1, and the number of exchanges needed has the same parity as \sigma, so the total factor picked up is exactly \sgn(\sigma). Hence one finally gets


\begin{align}
F(A)& = \sum_{\sigma \in \mathfrak S_n} \sgn(\sigma) \prod_{i = 1}^n a_{\sigma(i)}^i F(I)\\
& = \sum_{\sigma \in \mathfrak S_n} \sgn(\sigma) \prod_{i = 1}^n a_{\sigma(i)}^i
\end{align}

since F(I) is required to be equal to 1. This shows that any function F with the stated properties must be given by the Leibniz formula; conversely, the Leibniz formula is readily checked to be multilinear and alternating in the columns and to take the value 1 on the identity matrix, which establishes existence.
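
For n = 2, for instance, the terms F\left(E^{1}, E^{1}\right) and F\left(E^{2}, E^{2}\right) vanish because two columns coincide, while F\left(E^{2}, E^{1}\right) = -F\left(E^{1}, E^{2}\right) = -F(I) = -1, so the expansion collapses to F(A) = a_1^1 a_2^2 - a_2^1 a_1^2, in agreement with the Leibniz formula for 2 × 2 matrices.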

Hence the determinant can be defined as the unique function

 \det : \mathfrak M_n (\mathbb K) \longrightarrow \mathbb K

which is alternating and multilinear with respect to the columns and such that \det(I) = 1.