Matrix theory

Matrix theory is a branch of mathematics that focuses on the study of matrices. Initially a sub-branch of linear algebra, it has grown to cover subjects related to graph theory, algebra, combinatorics, and statistics.

History and overview

The term matrix was coined in 1848 by J. J. Sylvester as a name for an array of numbers. In 1855, Arthur Cayley introduced the matrix as a representation of a linear transformation. This period is considered the beginning of linear algebra and matrix theory. The study of vector spaces over finite fields, a branch of linear algebra useful in coding theory, naturally leads to the study and use of matrices over finite fields.

A module is a generalization of a vector space; it can be thought of as a vector space whose scalars are drawn from a ring rather than a field. This leads to the study of matrices over rings. Matrix theory in this area is not often considered a branch of linear algebra. Among the results listed under Useful theorems, the Cayley–Hamilton theorem is valid whenever the underlying ring is commutative, and the Smith normal form is valid whenever the underlying ring is a principal ideal domain, but others are valid only for matrices over the real or complex numbers.
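
As a small illustration, the following sketch (assuming the sympy library is available) verifies the Cayley–Hamilton theorem for a 2×2 matrix over the integers, which form a commutative ring:

    from sympy import Matrix, eye, zeros

    A = Matrix([[1, 2], [3, 4]])          # entries in the commutative ring of integers
    # For a 2x2 matrix the characteristic polynomial is
    #   p(t) = t**2 - trace(A)*t + det(A),
    # and Cayley-Hamilton states that p(A) is the zero matrix.
    p_of_A = A**2 - A.trace()*A + A.det()*eye(2)
    assert p_of_A == zeros(2, 2)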

The original motivation for linear algebra, and the first use of matrices, was the study of systems of linear equations. Related concepts such as the determinant and Gaussian elimination existed long before the idea of a matrix was introduced, and are now part of matrix theory.
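
A minimal sketch, assuming numpy is available, of solving such a system in matrix form:

    import numpy as np

    # The system  x + 2y = 5,  3x + 4y = 6  written as A x = b.
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    b = np.array([5.0, 6.0])
    x = np.linalg.solve(A, b)              # LU factorization, i.e. Gaussian elimination
    print(x)                               # [-4.   4.5]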

Magic squares and Latin squares, two ancient branches of recreational mathematics, are now reformulated in the language of matrices. The link between Latin squares and coding theory shows that this is more than a coincidence. If these two branches are taken into account, the origins of matrix theory can be pushed back as far as 650 BC.
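
For instance, the following check (numpy assumed) treats the ancient Lo Shu magic square as a matrix and verifies its defining sums:

    import numpy as np

    M = np.array([[4, 9, 2],
                  [3, 5, 7],
                  [8, 1, 6]])
    assert (M.sum(axis=0) == 15).all()     # column sums
    assert (M.sum(axis=1) == 15).all()     # row sums
    assert np.trace(M) == 15               # main diagonal
    assert np.trace(np.fliplr(M)) == 15    # anti-diagonal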

With the advance of computer technology, it has become possible to solve systems of very many linear equations in practice, not just in theory. John von Neumann and Herman Goldstine introduced condition numbers in 1947 in their analysis of round-off errors. Later, further techniques for the computation, multiplication, and factorization of matrices were invented, such as the fast Fourier transform.
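
A small sketch (numpy assumed) of how a condition number signals that a system will magnify round-off and input errors:

    import numpy as np

    well = np.array([[2.0, 0.0],
                     [0.0, 1.0]])
    ill = np.array([[1.0, 1.0],
                    [1.0, 1.0001]])        # nearly singular
    print(np.linalg.cond(well))            # 2.0: well conditioned
    print(np.linalg.cond(ill))             # about 4e4: errors are greatly magnified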

The payoff matrix in game theory, another field pioneered by John von Neumann, may have been the first use of matrices in economics.

The simplex algorithm, a technique involving operations on matrices of very large size, is used to solve problems in operations research, a field strongly related to economics. The network flow problem, which belongs to both graph theory and linear programming, can be solved with the simplex algorithm, although more efficient methods exist.
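
A minimal sketch, assuming scipy is available, of a small linear program of the kind such methods solve (note that scipy's default solver is not necessarily the classical simplex method):

    from scipy.optimize import linprog

    # Maximize x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0,
    # written as a minimization of -x - 2y (linprog minimizes, and
    # variables are nonnegative by default).
    c = [-1.0, -2.0]
    A_ub = [[1.0, 1.0], [1.0, 3.0]]
    b_ub = [4.0, 6.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub)
    print(res.x, -res.fun)                 # optimum (3, 1) with value 5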

There are other uses of matrices in graph theory. For example, the adjacency matrix is a representation of an undirected graph.
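
A minimal sketch (numpy assumed) of building the adjacency matrix of a small undirected graph; the vertices, edges, and walk-counting check are illustrative:

    import numpy as np

    edges = [(0, 1), (1, 2)]               # undirected edges 0-1 and 1-2
    n = 3
    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i, j] = A[j, i] = 1              # symmetry reflects undirectedness
    print(A)
    # Powers of A count walks: (A @ A)[0, 2] is the number of
    # 2-step walks from vertex 0 to vertex 2 (here 1, via vertex 1).
    print((A @ A)[0, 2])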

The adjacency matrix is a type of nonnegative matrix. Permutation matrices, the matrix representations of permutations in combinatorics, are also nonnegative. Another important matrix in combinatorics is the Hadamard matrix.
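
A small sketch, assuming numpy and scipy are available, of a permutation matrix and a Hadamard matrix:

    import numpy as np
    from scipy.linalg import hadamard

    perm = [2, 0, 1]
    P = np.eye(3, dtype=int)[perm]         # rows of the identity, reordered
    v = np.array([10, 20, 30])
    print(P @ v)                           # [30 10 20]: entry i of the result is v[perm[i]]

    H = hadamard(4)                        # +-1 entries, mutually orthogonal rows
    print(H @ H.T)                         # equals 4 times the identity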

A useful class of nonnegative matrices is formed by the stochastic matrices and doubly stochastic matrices. Stochastic matrices are useful in the study of stochastic processes, in probability theory, and in statistics. The evaluation of an enormous stochastic matrix is the central idea behind the PageRank algorithm used by Google. It is worth noting that every doubly stochastic matrix is a convex combination of permutation matrices (the Birkhoff–von Neumann theorem).
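
A minimal sketch (numpy assumed) of the idea behind PageRank as power iteration on a stochastic matrix; the link structure and damping factor here are illustrative, not Google's actual data:

    import numpy as np

    # Column-stochastic link matrix of a tiny three-page web:
    # page 0 links to pages 1 and 2, page 1 links to page 2,
    # and page 2 links to page 0.
    S = np.array([[0.0, 0.0, 1.0],
                  [0.5, 0.0, 0.0],
                  [0.5, 1.0, 0.0]])
    d = 0.85                               # damping factor
    n = 3
    G = d * S + (1 - d) / n                # still column-stochastic
    r = np.full(n, 1.0 / n)
    for _ in range(50):                    # power iteration
        r = G @ r
    print(r)                               # page ranks, summing to 1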

Another important tool in statistics is the correlation matrix.
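
A brief illustration (numpy assumed) of computing a correlation matrix from data:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.1, 3.9, 6.2, 7.8])     # roughly 2*x, so strongly correlated
    print(np.corrcoef(x, y))               # 2x2 correlation matrix, 1s on the diagonal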

Computer graphics also involves heavy computation with matrices: geometric transformations of images and models are applied as matrix operations, and research continues into ways of minimizing the memory needed for graphics of the best quality.
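
A minimal sketch (numpy assumed) of a geometric transformation, here a plane rotation, expressed and applied as a matrix:

    import numpy as np

    theta = np.pi / 2                      # rotate 90 degrees counterclockwise
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    points = np.array([[1.0, 0.0],
                       [0.0, 1.0]]).T      # one point per column
    print(R @ points)                      # (1,0) -> (0,1) and (0,1) -> (-1,0)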

In optimization problems involving multivariable real-valued functions, positive-definite matrices occur in the search for maxima and minima: a critical point is a local minimum when the Hessian matrix there is positive definite.
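
A small sketch (numpy assumed): certifying that the Hessian of a simple function is positive definite, so its critical point is a minimum:

    import numpy as np

    # For f(x, y) = x**2 + x*y + y**2 the Hessian is constant:
    H = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    np.linalg.cholesky(H)                  # raises LinAlgError if not positive definite
    print(np.linalg.eigvalsh(H))           # [1. 3.]: all eigenvalues positive,
                                           # so the origin is a minimum of f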

There are also practical uses for matrices over arbitrary rings (see Matrix ring). In particular, matrices over polynomial rings are used in control theory.
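
A minimal sketch, assuming sympy, of a matrix with polynomial entries of the kind control theory uses (for example sI - A for a state-space system; the matrix A here is a hypothetical example):

    from sympy import Matrix, symbols, eye

    s = symbols('s')
    A = Matrix([[0, 1], [-2, -3]])         # state matrix of a hypothetical system
    sIA = s * eye(2) - A                   # entries live in the polynomial ring R[s]
    print(sIA.det().expand())              # s**2 + 3*s + 2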

On the pure mathematics side, matrix rings can provide a rich field of counterexamples for mathematical conjectures, among other uses. Square matrices also play a special role, because the n×n matrices for fixed n have many closure properties.

A matrix is square precisely when its number of rows equals its number of columns; an m×n matrix with m ≠ n is rectangular rather than square.

Useful theorems

See also

  • List of matrices. This list is a rich source of information and links to a very wide variety of matrices from mathematics, science and engineering.

References

External links