Tensor (intrinsic definition)

From Wikipedia, the free encyclopedia

In mathematics, the modern component-free approach to the theory of tensors views tensors initially as abstract objects expressing some definite type of multilinear concept. Their well-known properties can be derived from their definitions, as linear maps or more generally; and the rules for manipulating tensors arise as an extension of linear algebra to multilinear algebra.

In differential geometry, an intrinsic geometric statement may be described by a tensor field on a manifold, and then need not make reference to coordinates at all. The same is true in general relativity of tensor fields describing a physical property. The component-free approach is also used heavily in abstract algebra and homological algebra, where tensors arise naturally.

Note: This article, which is fairly abstract, requires an understanding of the tensor product of vector spaces without chosen bases; this notion of tensor product generalizes even further, to modules. If you find this article difficult, try reading the main tensor article and the classical or intermediate-level treatments first.


Definition: Tensor Product of Vector Spaces

Let V and W be two vector spaces over a common field F. Their tensor product

V \otimes_F W

is a vector space over the same field F together with a bilinear map

\otimes: V \times W \rightarrow V \otimes W

which is universal (i.e., the smallest possible without throwing away information) in the following sense:

for every vector space X over the field F and every F-bilinear map

Q: V \times W \rightarrow X

there is a unique F-linear map

Q': V \otimes_F W \rightarrow X

such that

Q (v,w)  = Q'(v\otimes w) \ \ \forall v \in V, w \in W.

It is easy to see that the vector space V \otimes_F W is unique up to isomorphism if it exists, so one speaks of the tensor product rather than a tensor product.

All its properties, except its existence, follow from the abstract definition, although some properties are more easily understood from an explicit model.
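The universal property can be illustrated numerically in a coordinate model. Below, the bilinear map Q(v, w) = vᵀAw is an arbitrary example, the tensor product is modeled by the flattened outer product, and the induced linear map Q′ is given by the flattened matrix A; a minimal sketch in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary bilinear map Q: R^3 x R^4 -> R, represented by a matrix A,
# so that Q(v, w) = v^T A w.
A = rng.normal(size=(3, 4))
Q = lambda v, w: v @ A @ w

# The bilinear map "otimes" sends (v, w) to the simple tensor v ⊗ w,
# realized here as the outer product flattened into a vector in R^12.
otimes = lambda v, w: np.outer(v, w).ravel()

# The unique induced linear map Q' on V ⊗ W is given by the flattened A.
Qprime = lambda t: A.ravel() @ t

v = rng.normal(size=3)
w = rng.normal(size=4)

# Universal property: Q(v, w) = Q'(v ⊗ w).
assert np.isclose(Q(v, w), Qprime(otimes(v, w)))
```

The point of the sketch is that the bilinear data of Q is entirely captured by a single linear map on the tensor product space.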

An explicit construction is easy to give using bases {vi} and {wj} for V and W respectively. The tensor product V \otimes W can be constructed as the vector space spanned by the basis

\{ \mathbf{v}_i  \otimes \mathbf{w}_j \}

Here the symbol \otimes in the basis elements can be seen either as a formal symbol for forming a pair, or as the value of the bilinear map \otimes on the basis vectors. The extension of \otimes to all of V \times W is then done in the unique way compatible with bilinearity.

Alternatively, without using a basis, one can give a construction of the tensor product as follows. Let M be the free abelian group on the set V \times W, and consider the subgroup S generated by all elements of the form

(v_1 + v_2, w) - (v_1, w) - (v_2, w)
(v, w_1 + w_2) - (v, w_1) - (v, w_2)
(rv, w) - (v, rw)

for all v, v_1, v_2 \in V, w, w_1, w_2 \in W, r \in F. The quotient M/S is an abelian group, and one equips it with a scalar multiplication canonically via r((v,w) + S) = (rv,w) + S. The resulting vector space is denoted by V \otimes_F W, and together with the map \otimes: V \times W \rightarrow V \otimes W given by (v,w) \mapsto (v,w) + S, it satisfies the universal property stated above. This construction is also the one that generalizes to modules, which in general need not be free, i.e. need not have a basis.

The elements of this quotient space are termed "tensors". Generally, the shorthand v \otimes w is employed in place of (v,w) + S. This notation can somewhat obscure the fact that tensors are always cosets: manipulations performed via the representatives (v,w) must be checked to ensure that they do not depend on the particular choice of representative.

From the above construction, the following identities are immediate:

(v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w
v\otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2
r(v \otimes w) = (rv) \otimes w = v \otimes (rw)
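These identities can be checked numerically in the outer-product model of the tensor product, where v \otimes w is realized as the matrix of products v_i w_j; the vectors and scalar below are arbitrary choices for illustration:

```python
import numpy as np

v1, v2 = np.array([1., 2.]), np.array([3., -1.])
w1, w2 = np.array([0., 5., 2.]), np.array([4., 1., 1.])
r = 2.5

# In the outer-product model, v ⊗ w is np.outer(v, w).
t = np.outer

# Additivity in the first argument.
assert np.allclose(t(v1 + v2, w1), t(v1, w1) + t(v2, w1))
# Additivity in the second argument.
assert np.allclose(t(v1, w1 + w2), t(v1, w1) + t(v1, w2))
# Scalars move freely across the product.
assert np.allclose(r * t(v1, w1), t(r * v1, w1))
assert np.allclose(r * t(v1, w1), t(v1, r * w1))
```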

Tensors of the type v \otimes w, for v \in V, w \in W are called simple tensors. It is not true that all tensors in the tensor space V \otimes W are simple tensors; however, all tensors are finite F-linear combinations of such simple tensors. Generally, v \otimes w \neq w \otimes v; one may symmetrize the algebra by dividing out by the appropriate commutator relation.
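In the matrix model of V \otimes W, a simple tensor v \otimes w is a rank-one matrix, so matrix rank detects tensors that are not simple. For instance, e_1 \otimes e_1 + e_2 \otimes e_2 (the identity matrix in this model) has rank 2 and therefore cannot be written as a single simple tensor; a sketch:

```python
import numpy as np

e1, e2 = np.array([1., 0.]), np.array([0., 1.])

simple = np.outer(e1, e2)                         # a simple tensor: rank 1
not_simple = np.outer(e1, e1) + np.outer(e2, e2)  # the identity matrix

assert np.linalg.matrix_rank(simple) == 1
assert np.linalg.matrix_rank(not_simple) == 2     # rank 2, hence not simple
```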

If V and W are both finite-dimensional, then the dimension of V \otimes W is the product of the dimensions of V and W. The tensor product construction can be iterated to apply to more than two vector spaces.
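The dimension count, and the iteration to three or more factors, can be made concrete in the outer-product model (the dimensions below are arbitrary):

```python
import numpy as np

v = np.ones(3)   # dim V = 3
w = np.ones(4)   # dim W = 4
u = np.ones(2)   # dim U = 2

vw = np.multiply.outer(v, w)      # an element of V ⊗ W
vwu = np.multiply.outer(vw, u)    # an element of (V ⊗ W) ⊗ U

assert vw.size == 3 * 4           # dim(V ⊗ W) = dim V · dim W
assert vwu.shape == (3, 4, 2)     # iterated product has dimension 3·4·2 = 24
```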

A tensor on the vector space V is then defined to be an element of (i.e., a vector in) the following vector space:

V \otimes \cdots \otimes V \otimes V^* \otimes \cdots \otimes V^*

where V* is the dual space of V.

If there are m copies of V and n copies of V* in our product, the tensor is said to be of type (m, n), of contravariant rank m and covariant rank n, and of total rank m + n. The tensors of rank zero are just the scalars (elements of the field F), those of contravariant rank 1 are the vectors in V, and those of covariant rank 1 are the one-forms in V* (for this reason the elements of the last two spaces are often called the contravariant and covariant vectors).

The (1,1) tensors

V \otimes V^*

are isomorphic in a natural way to the space of linear transformations (i.e., matrices) from V to V. An inner product V × V → R on a real vector space V corresponds in a natural way to a (0,2) tensor in

V^* \otimes V^*

called the associated metric and usually denoted g.
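Both identifications can be checked in coordinates: a (1,1) tensor acts as a matrix (a linear map V → V), while a (0,2) tensor such as the metric g takes two vectors to a scalar. In the sketch below, T and the symmetric positive-definite matrix G are arbitrary choices standing in for a (1,1) tensor and a metric:

```python
import numpy as np

# A (1,1) tensor on R^3, viewed as a linear map V -> V.
T = np.array([[2., 0., 1.],
              [0., 1., 0.],
              [1., 0., 3.]])
v = np.array([1., 2., 3.])
assert np.allclose(T @ v, np.einsum('ij,j->i', T, v))

# A (0,2) tensor: a metric, here an arbitrary SPD matrix G.
G = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 1.]])
w = np.array([0., 1., -1.])
# g(v, w) = G_ij v^i w^j, an inner product V x V -> R.
assert np.isclose(v @ G @ w, np.einsum('ij,i,j->', G, v, w))
```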

Alternate notation

Rather than writing out the full tensor product to denote the space of tensors of type (m,n), the literature often uses the abbreviation

T^m_n(V) = \underbrace{V \otimes \cdots \otimes V}_{m} \otimes \underbrace{V^* \otimes \cdots \otimes V^*}_{n}

Another notation for this space is in terms of linear maps from a vector space V to a vector space W. Let

L(V,W)\

denote the space of all linear maps from V to W. Thus, for example, the dual space (the space of 1-forms) may be written as

V^* \approx L(V,\mathbb{R})

The set of (m,n)-tensors can then be written as

T^m_n(V) \approx L(V^* \otimes \cdots \otimes V^* \otimes V \otimes \cdots \otimes V, \mathbb{R}) \approx L^{m+n}(V^*, \ldots, V^*, V, \ldots, V; \mathbb{R})

In the formula above, the roles of V and V* are reversed. In particular, one has

T^1_0(V) \approx L(V^*,\mathbb{R}) \approx V

and

T^0_1(V) \approx L(V,\mathbb{R}) \approx V^*

and

T^1_1(V) \approx L(V,V)
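The identification T^1_1(V) ≈ L(V, V) can be verified in coordinates: the same array A acts either as a linear map v ↦ Av, or as a multilinear map V* × V → R sending (α, v) to α(Av). A sketch with arbitrary A, v, and α:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])     # a (1,1) tensor on R^2

v = np.array([1., -1.])      # a vector (element of V)
alpha = np.array([2., 5.])   # a covector (element of V*)

# As a linear map V -> V:
Av = A @ v

# As a multilinear map V* x V -> R:
scalar = np.einsum('i,ij,j->', alpha, A, v)

# The two points of view agree: the multilinear value is alpha applied to Av.
assert np.isclose(scalar, alpha @ Av)
```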

The notation

GL(V,W)\

is often used to denote the space of invertible linear transformations from V to W; however, there is no analogous notation for tensor spaces.

Tensor fields

See the main article: tensor field

Differential geometry, physics and engineering must often deal with tensor fields on smooth manifolds. The term tensor is in fact sometimes used as a shorthand for tensor field. A tensor field expresses the concept of a tensor that varies from point to point.

Basis

For any given coordinate system we have a basis {e_i} for the tangent space V (this may vary from point to point if the manifold is not linear), and a corresponding dual basis {e^i} for the cotangent space V* (see dual space). The difference between the raised and lowered indices is there to remind us of the way the components transform.

For example purposes, then, take a tensor A in the space

V \otimes  V \otimes  V^*

The components relative to our coordinate system can be written

\mathbf{A} = A^{ij} {}_k (\mathbf{e}_i \otimes \mathbf{e}_j \otimes \mathbf{e}^k)

Here we used the Einstein notation, a convention useful when dealing with coordinate equations: when an index variable appears both raised and lowered on the same side of an equation, we are summing over all its possible values. In physics we often use the expression

A^{ij} {}_k\

to represent the tensor, just as vectors are usually treated in terms of their components. This can be visualized as an n × n × n array of numbers. In a different coordinate system, say given to us as a basis {e_{i'}}, the components will be different. If (x^{i'}{}_i) is our transformation matrix (note it is not a tensor, since it represents a change of basis rather than a geometrical entity) and if (y^i{}_{i'}) is its inverse, then our components transform as

A^{i'j'}\! {}_{k'} = x^{i'}\! {}_i \, x^{j'}\! {}_j \, y^k\! {}_{k'} \, A^{ij} {}_k

In older texts this transformation rule often serves as the definition of a tensor. Formally, this means that tensors were introduced as specific representations of the group of all changes of coordinate systems.
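The transformation rule above is a triple contraction and can be written directly with np.einsum. In the sketch below, the components A and the change-of-basis matrix x are arbitrary (x is shifted toward the identity only to guarantee invertibility), and y is its inverse:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

A = rng.normal(size=(n, n, n))   # components A^{ij}_k in the old basis

x = rng.normal(size=(n, n)) + n * np.eye(n)  # change-of-basis matrix (x^{i'}_i)
y = np.linalg.inv(x)                         # its inverse (y^i_{i'})

# A'^{i'j'}_{k'} = x^{i'}_i x^{j'}_j y^k_{k'} A^{ij}_k
A_new = np.einsum('ai,bj,kc,ijk->abc', x, x, y, A)

# Transforming back with the inverse matrices recovers the original components.
A_back = np.einsum('ai,bj,kc,ijk->abc', y, y, x, A_new)
assert np.allclose(A_back, A)
```

That the round trip returns the original array reflects the fact that the transformation rule depends only on the change of basis, not on the tensor itself.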

