
Tuesday, May 1, 2012

Tensor Products and Algebras

$V, W$ are vector spaces over the field $K$. A vector space $T$ over $K$ together with a bilinear map $\phi: V \times W \to T$ is called a tensor product of $V$ and $W$ if for every vector space $U$ over $K$ and every bilinear map $f: V \times W \to U$ a unique linear map $h: T \to U$ exists such that $f = h \circ \phi$. The elements of $(T, \phi)$ are called tensors (singular: tensor).

Let $V$ and $W$ be of finite dimension. Then even the largest possible image of any bilinear map $f$ fits into a vector space $U$ of dimension $\dim V \cdot \dim W$. Say for a map $f$ the images of the basis vectors $(b_i)$ of $V$ and $(c_j)$ of $W$ are linearly independent (we can construct such a map, since a bilinear map is already determined by the images of the pairs of basis vectors), and every other image of a pair taken from $V \times W$ lies in the span of $(f(b_i, c_j))$: $$\forall (v, w) \in V \times W:\; v = \sum_i v_i b_i,\; w = \sum_j w_j c_j:\quad f(v, w) = \sum_{i,j} v_i w_j f(b_i, c_j)$$ Now the space $T$ cannot have even one dimension less than $U$ with $\dim U = \dim V \cdot \dim W$ (the smallest vector space, in terms of dimension, that can "hold" the image of such an $f$), because otherwise we would not be able to find the homomorphism $h$ at all.
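In coordinates, this determination by basis images can be sketched as follows (a minimal illustration; the array `F` of images $f(b_i, c_j)$ and all names are my own, not from the text):

```python
def bilinear_from_basis_images(F):
    # F[i][j] is the image f(b_i, c_j), a coordinate list in some U = R^k.
    # The returned map satisfies f(v, w) = sum_{i,j} v_i * w_j * f(b_i, c_j).
    k = len(F[0][0])
    def f(v, w):
        out = [0.0] * k
        for i, vi in enumerate(v):
            for j, wj in enumerate(w):
                for t in range(k):
                    out[t] += vi * wj * F[i][j][t]
        return out
    return f

# dim V = dim W = 2, U = R^4, with the images f(b_i, c_j) chosen linearly
# independent (the standard basis of R^4), as in the argument above:
F = [[[1, 0, 0, 0], [0, 1, 0, 0]],
     [[0, 0, 1, 0], [0, 0, 0, 1]]]
f = bilinear_from_basis_images(F)
assert f([1, 2], [3, 4]) == [3.0, 4.0, 6.0, 8.0]
```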

More precisely: the sequence $(\phi(b_i, c_j))$ has to be linearly independent, since we chose an $f$ (we may do so, because $f$ can be chosen arbitrarily by definition) such that the sequence $(f(b_i, c_j))$ is linearly independent. Assume $(\phi(b_i, c_j))$ were linearly dependent; then there would exist scalars $\lambda_{ij}$, not all zero, such that $\sum_{i,j} \lambda_{ij} \phi(b_i, c_j) = 0$. Applying $h$ yields $$h\Big(\sum_{i,j} \lambda_{ij} \phi(b_i, c_j)\Big) = \sum_{i,j} \lambda_{ij}\, h(\phi(b_i, c_j)) = 0.$$ But it is also $h(\phi(b_i, c_j)) = f(b_i, c_j)$ by definition, so $$h\Big(\sum_{i,j} \lambda_{ij} \phi(b_i, c_j)\Big) = \sum_{i,j} \lambda_{ij}\, h(\phi(b_i, c_j)) = 0 = \sum_{i,j} \lambda_{ij} f(b_i, c_j),$$ which contradicts our assumption that $(f(b_i, c_j))$ is linearly independent. So $(\phi(b_i, c_j))$ must be a linearly independent set as well. It is easier to see that the dimension of $T$ cannot be bigger than the dimension of $U$, because then there would be no unique linear map $h$ with $f = h \circ \phi$.

Now we can define $\otimes: V \times W \to (V \otimes W := (T, \phi)),\; (v, w) \mapsto v \otimes w := \phi(v, w)$ and say: if $(b_i)$ is a basis of $V$ and $(c_j)$ a basis of $W$, then $(b_i \otimes c_j)$ is a basis of $V \otimes W$.

The existence of tensor products is more or less easy to see, at least in this case, where only finite-dimensional vector spaces are treated. We just construct a vector space $T$ of dimension $\dim V \cdot \dim W$ over the field $K$ and define a bilinear map by choosing the images $b_i \otimes c_j$ of the pairs of basis vectors $(b_i, c_j)$ of $V \times W$. Then we can find for any $K$-vector space $U$ and any bilinear map $f: V \times W \to U$ a linear map $h$ such that $f = h \circ \otimes$. It is only required to set $h(b_i \otimes c_j) := f(b_i, c_j)\ \forall i, j$. Note that $h$ is unambiguously defined by the images of the basis vectors of $T$. So far it can also be seen that $T$ can be any vector space over $K$, as long as it has the required dimension.
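A minimal sketch of this construction for $K = \mathbb{R}$, representing elements of $T$ by coordinate lists of length $\dim V \cdot \dim W$ (the Kronecker-style coordinate product and all names are my own, not from the text):

```python
def kron(v, w):
    # Coordinates of v ⊗ w in the basis (b_i ⊗ c_j): the (i, j)-entry
    # is v_i * w_j, flattened row by row.
    return [a * b for a in v for b in w]

# Bilinearity in the first argument: (v1 + v2) ⊗ w = v1 ⊗ w + v2 ⊗ w
v1, v2, w = [1.0, 2.0], [3.0, -1.0], [0.5, 4.0]
lhs = kron([a + b for a, b in zip(v1, v2)], w)
rhs = [a + b for a, b in zip(kron(v1, w), kron(v2, w))]
assert lhs == rhs
```

In this picture $b_i \otimes c_j$ is simply a standard basis vector of $\mathbb{R}^{\dim V \cdot \dim W}$, and $h$ is determined by where it sends these basis vectors.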

The tensor product $V \otimes W$ exists, and every other tensor product of $V$ and $W$ is isomorphic to $V \otimes W$.

And since all tensor products agree up to isomorphism, we can write for vector spaces $V_1, V_2, V_3$ over $K$ $$V_1 \otimes (V_2 \otimes V_3) \cong (V_1 \otimes V_2) \otimes V_3 =: V_1 \otimes V_2 \otimes V_3.$$ For $N$ vector spaces $V_i$ over $K$ we define $$\bigotimes^N V_i := V_1 \otimes V_2 \otimes \cdots \otimes V_N, \qquad \bigotimes^0 V := K.$$
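The associativity up to isomorphism is visible in coordinates: with the Kronecker-style coordinate product (a sketch of my own, not from the text), both bracketings give the very same coordinate list, so the brackets can be dropped:

```python
def kron(v, w):
    # coordinates of v ⊗ w in the induced basis
    return [a * b for a in v for b in w]

v1, v2, v3 = [1, 2], [3, 4, 5], [6, 7]
# (v1 ⊗ v2) ⊗ v3 and v1 ⊗ (v2 ⊗ v3) have identical coordinates:
assert kron(kron(v1, v2), v3) == kron(v1, kron(v2, v3))
```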

Now we show:
For any vector space $U$ over $K$ and any $N$-linear map $f: \prod_{i=1}^N V_i \to U$ a unique homomorphism $h: \bigotimes^N V_i \to U$ exists such that $f = h \circ \otimes$ (where $\otimes: \prod_{i=1}^N V_i \to \bigotimes^N V_i$ is of course $N$-linear).

We know that for any bilinear map $f': \big(\bigotimes^{N-1} V_i\big) \times V_N \to U$ there exists a unique homomorphism $h: \bigotimes^N V_i \to U$ such that $f' = h \circ \otimes$. Both maps $f$ and $f'$ are unambiguously defined by the images of a basis of their domains. When we look more closely we realize that both maps require exactly $\prod_{i=1}^N \dim V_i$ images. This means that for any map $f$ we can find a bilinear map $f'$ such that for all vectors $(v_1, v_2, \dots, v_N) \in \prod_{i=1}^N V_i$ $$f(v_1, v_2, \dots, v_N) = f'(v_1 \otimes v_2 \otimes \cdots \otimes v_{N-1},\; v_N) = h(v_1 \otimes v_2 \otimes \cdots \otimes v_N).$$
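A small numerical sketch of $f = h \circ \otimes$ (the particular maps are my own choice, not from the text): take the $N$-linear map $f(v_1, \dots, v_N) := \prod_k \big(\sum_i (v_k)_i\big)$; it factors through the $N$-fold tensor product, with $h$ the linear map that sums all coordinates:

```python
from functools import reduce

def kron(v, w):
    # coordinates of v ⊗ w in the induced basis
    return [a * b for a in v for b in w]

def kron_n(vectors):
    # v_1 ⊗ v_2 ⊗ ... ⊗ v_N via iterated binary tensor products
    return reduce(kron, vectors)

def f(*vectors):
    # an N-linear map: the product of the coordinate sums of each argument
    out = 1.0
    for v in vectors:
        out *= sum(v)
    return out

def h(t):
    # the linear map on the tensor product that f factors through
    return sum(t)

vs = [[1.0, 2.0], [3.0, 4.0], [0.5, -1.5]]
assert abs(f(*vs) - h(kron_n(vs))) < 1e-9   # f = h ∘ ⊗
```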

Further we define $\bigotimes^n_m V := \big(\bigotimes^n V\big) \otimes \big(\bigotimes^m V^*\big)$ and call the tensor product $\bigotimes^n_m V$ $n$ times contravariant and $m$ times covariant.

So the tensor product $\bigotimes^0_0 V = K$ would be the space of all scalars, $\bigotimes^1_0 V = V$ the space of all vectors and $\bigotimes^0_1 V = V^*$ the space of all linear forms $V \to K$.
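As a further illustration of the mixed case (a standard identification, added here; it is not stated above), a type-$(1,1)$ tensor can be viewed as a linear map $V \to V$:

```latex
\bigotimes\nolimits^1_1 V = V \otimes V^* \;\cong\; \operatorname{Hom}(V, V),
\qquad (v \otimes \alpha)(u) := \alpha(u)\, v
\quad (v, u \in V,\ \alpha \in V^*).
```

In coordinates this is the familiar statement that a mixed tensor with one upper and one lower index is a matrix.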

Let $I$ be an index set and $V_i$ for all $i \in I$ a family of vector spaces over $K$; then $$\bigoplus_{i \in I} V_i := \Big\{ (v_i)_{i \in I} \in \prod_{i \in I} V_i \;\Big|\; \text{almost all } v_i = 0 \Big\}$$ is called the direct sum of the vector spaces $V_i$ (see wikipedia).
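The "almost all $v_i = 0$" condition can be sketched by storing an element of the direct sum as a finite-support mapping from indices to components (here with scalar components for simplicity; the representation and names are mine, not from the text):

```python
def direct_sum_add(x, y):
    # Componentwise addition of two finite-support elements; components
    # that cancel to zero are dropped, keeping the support finite.
    out = dict(x)
    for k, yk in y.items():
        s = out.get(k, 0) + yk
        if s == 0:
            out.pop(k, None)
        else:
            out[k] = s
    return out

# Only finitely many indices carry a nonzero component:
assert direct_sum_add({0: 1, 5: 2}, {5: -2, 7: 3}) == {0: 1, 7: 3}
```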

The sets $$T(V) := \bigoplus_{i \ge 0} \bigotimes^i V = K \oplus V \oplus (V \otimes V) \oplus \cdots, \qquad T(V, V^*) := \bigoplus_{i, j \ge 0} \bigotimes^i_j V$$ together with the multiplication, for $v = (v_i), w = (w_i) \in T(V)$ and $a = (a_{ij}), b = (b_{kl}) \in T(V, V^*)$, $$v \cdot w := \Big( \sum_{i + j = n} v_i \otimes w_j \Big)_{n \ge 0}, \qquad a \cdot b := \Big( \sum_{i + k = m,\; j + l = n} a_{ij} \otimes b_{kl} \Big)_{m, n \ge 0}$$ form algebras. $T(V)$ is called the tensor algebra and $T(V, V^*)$ the mixed tensor algebra.
The neutral element is $1 = (1_K, 0, 0, \dots)$.
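The graded multiplication in $T(V)$ can be sketched in coordinates (my own representation, not from the text: an element of $T(V)$ as a list of components, component $i$ holding the $(\dim V)^i$ coordinates of its $\bigotimes^i V$ part). For $\dim V = 1$ the tensor algebra reduces to the polynomial ring $K[x]$ and this multiplication to ordinary polynomial multiplication:

```python
def kron(x, y):
    # coordinates of the tensor product of two homogeneous components
    return [a * b for a in x for b in y]

def tensor_algebra_mul(v, w):
    # (v . w)_n = sum over i + j = n of v_i ⊗ w_j
    out = []
    for n in range(len(v) + len(w) - 1):
        comp = None
        for i in range(len(v)):
            j = n - i
            if 0 <= j < len(w):
                term = kron(v[i], w[j])
                comp = term if comp is None else [a + b for a, b in zip(comp, term)]
        out.append(comp)
    return out

# dim V = 1: (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
assert tensor_algebra_mul([[1], [2]], [[3], [4]]) == [[3], [10], [8]]
```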