Tuesday, May 1, 2012

Tensor Products and Algebras

Let \(V,W\) be vector spaces over a field \(K\). A vector space \(T\) over \(K\) together with a bilinear map \(\phi:V\times W\rightarrow T\) is called a tensor product of \(V\) and \(W\) if for every vector space \(U\) over \(K\) and every bilinear map \(f:V\times W\rightarrow U\) there exists a unique linear map \(h:T\rightarrow U\) such that \(f=h\circ \phi\). The elements of \(T\) are called tensors.

Let \(V\) and \(W\) be finite dimensional. Then even the largest possible image set of such a map \(f\) fits into a vector space \(U\) of dimension \(\mathrm{dim}V\cdot\mathrm{dim}W\). Choose a map \(f\) such that the images \(f(b^i,c^j)\) of the basis vectors \((b^i)\) of \(V\) and \((c^j)\) of \(W\) are linearly independent (we can construct such a map, since a bilinear map is determined by the images of the basis vectors alone); then every other image of a pair taken from \(V\times W\) lies in the span \(\left<(f(b^i,c^j))\right>\): \[\forall (v,w)\in V\times W: v=v_ib^i,\ w=w_jc^j \implies f(v,w)=v_iw_jf(b^i,c^j)\] Now the space \(T\) cannot be even a single dimension smaller than \(U\) with \(\mathrm{dim}U=\mathrm{dim}V\cdot\mathrm{dim}W\) (the smallest vector space, in terms of dimension, that can "hold" the image of this \(f\)), because otherwise no linear map \(h\) with \(f=h\circ\phi\) could exist at all.

More precisely: the sequence \((\phi(b^i,c^j))\) has to be linearly independent, since we chose an \(f\) (we may do so, since \(f\) can be chosen arbitrarily in the definition) such that the sequence \((f(b^i,c^j))\) is linearly independent. Assume \((\phi(b^i,c^j))\) were linearly dependent; then there exist scalars \(\lambda_{ij}\), not all zero, such that \(\lambda_{ij}\phi(b^i,c^j)=0\). Applying \(h\) yields \[h(\lambda_{ij}\phi(b^i,c^j))=\lambda_{ij}h(\phi(b^i,c^j))=0\] But also \(h(\phi(b^i,c^j))=f(b^i,c^j)\) by definition, so \[h(\lambda_{ij}\phi(b^i,c^j))=\lambda_{ij}h(\phi(b^i,c^j))=0=\lambda_{ij}f(b^i,c^j)\] which contradicts our assumption that \((f(b^i,c^j))\) is linearly independent. So \((\phi(b^i,c^j))\) must be linearly independent as well. It is easier to see that the dimension of \(T\) cannot be larger than the dimension of \(U\): otherwise we would not be able to find a unique linear map \(h\) such that \(f=h\circ \phi\).

Now we can define \[\otimes:V\times W\rightarrow (V\otimes W:=(T,\phi)):(v,w)\mapsto v\otimes w:=\phi(v,w)\] and say: if \((b^i)\) is a basis of \(V\) and \((c^j)\) is a basis of \(W\), then \((b^i\otimes c^j)\) is a basis of \(V\otimes W\).
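This basis claim can be checked concretely. A minimal sketch in NumPy (my illustration, not part of the original argument), modeling \(V=K^2\), \(W=K^3\) as coordinate spaces and \(\otimes\) as the Kronecker product `np.kron`:

```python
import numpy as np

# Model V = R^2, W = R^3; then v (x) w can be realized as np.kron(v, w),
# an element of R^6 = R^(dim V * dim W).
dimV, dimW = 2, 3
basis_V = np.eye(dimV)   # rows are the basis vectors b^1, b^2
basis_W = np.eye(dimW)   # rows are the basis vectors c^1, c^2, c^3

# Collect all products b^i (x) c^j as rows of a matrix.
products = np.array([np.kron(b, c) for b in basis_V for c in basis_W])

# They form a basis of R^6: the matrix of all products has full rank 6.
print(np.linalg.matrix_rank(products))  # 6
```

The Kronecker products of the standard basis vectors are exactly the standard basis of \(\mathbb{R}^6\), so they are linearly independent and span the whole space.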

The existence of tensor products is more or less easy to see, at least in this case where only finite dimensional vector spaces are treated. We just construct a vector space \(T\) of dimension \(\mathrm{dim}V\cdot\mathrm{dim}W\) over the field \(K\) and define a bilinear map \(\otimes\) by defining the images \(b^i\otimes c^j\) of the basis pairs \((b^i,c^j)\) of \(V\times W\). Then for any \(K\)-vector space \(U\) and any bilinear map \(f:V\times W\rightarrow U\) we can find a linear map \(h\) such that \(f=h\circ \otimes\): it suffices to set \(h(b^i\otimes c^j):=f(b^i,c^j)\ \forall i,j\). Note that \(h\) is unambiguously defined by the images of the basis vectors of \(T\). It can also be seen that \(T\) can be any vector space over \(K\), as long as it has the required dimension.
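The factorization \(f=h\circ\otimes\) can be sketched numerically (my illustration, with \(\otimes\) again modeled by `np.kron`): a bilinear \(f\) is stored as the matrix of its \(\mathrm{dim}V\cdot\mathrm{dim}W\) basis images, and that same matrix is the linear map \(h\).

```python
import numpy as np

dimV, dimW, dimU = 2, 3, 4
rng = np.random.default_rng(0)

# A bilinear map f: V x W -> U is determined by the dimV*dimW images
# f(b^i, c^j); stack them as the columns of a (dimU, dimV*dimW) matrix F.
F = rng.standard_normal((dimU, dimV * dimW))

def f(v, w):
    # Bilinear: f(v, w) = v_i w_j f(b^i, c^j), and np.kron(v, w) has
    # exactly the components v_i w_j.
    return F @ np.kron(v, w)

# h is the linear map with h(b^i (x) c^j) := f(b^i, c^j) -- i.e. the same
# matrix F, acting on the coordinate space of V (x) W = K^(dimV*dimW).
def h(t):
    return F @ t

v = rng.standard_normal(dimV)
w = rng.standard_normal(dimW)
assert np.allclose(f(v, w), h(np.kron(v, w)))  # f = h o (x)
```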

The tensor product \(V\otimes W\) exists, and every other tensor product of \(V\) and \(W\) is isomorphic to \(V\otimes W\).

And since any two tensor products differ only by an isomorphism, we can write for vector spaces \(V_1, V_2, V_3\) over \(K\)\[V_1\otimes (V_2\otimes V_3)\cong  (V_1\otimes V_2)\otimes V_3 =: V_1\otimes V_2\otimes V_3\] For \(N\) vector spaces \(V_i\) over \(K\) we define \[\otimes^NV_i:=V_1\otimes V_2\otimes\dots\otimes V_N\] \[\otimes^0V:=K\]
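In the coordinate model the two bracketings even agree exactly, not just up to isomorphism, since the Kronecker product is associative (again my illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, c = (rng.standard_normal(n) for n in (2, 3, 4))

# V1 (x) (V2 (x) V3)  and  (V1 (x) V2) (x) V3 give the same coordinates.
lhs = np.kron(a, np.kron(b, c))
rhs = np.kron(np.kron(a, b), c)
assert np.allclose(lhs, rhs)
```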

Now we show:
For any vector space \(U\) over \(K\) and any \(N\)-linear map \(f:\prod_{i=1}^NV_i\rightarrow U\) there exists a unique homomorphism \(h:\otimes^N V_i\rightarrow U\) such that \(f=h\circ \otimes\) (where \(\otimes:\prod_{i=1}^NV_i\rightarrow \otimes^NV_i\) is of course \(N\)-linear).

We know that for any bilinear map \(f':(\otimes^{N-1}V_i)\times V_N\rightarrow U\) a unique homomorphism \(h:\otimes^NV_i\rightarrow U\) exists such that \(f'=h\circ \otimes\). Both maps \(f\) and \(f'\) are unambiguously defined by the images of a basis of their domains. Looking more closely, we realize that both maps require exactly \(\prod_{i=1}^N\mathrm{dim}V_i\) images. This means that for any map \(f\) we can find a bilinear map \(f'\) such that for all vectors \((v_1,v_2,\dots,v_N)\in\prod_{i=1}^NV_i\) \[f(v_1,v_2,\dots,v_N)=f'(v_1\otimes v_2\otimes\dots\otimes v_{N-1}, v_N)=h(v_1\otimes v_2\otimes\dots\otimes v_N)\]

Further we define \[\otimes^n_mV:=(\otimes^nV)\otimes(\otimes^mV^*)\] and call the tensor product \(\otimes^n_mV\) \(n\) times contravariant and \(m\) times covariant.

So the tensor product \(\otimes^0_0V=K\) would be the space of all scalars, \(\otimes^1_0V=V\) the space of all vectors and \(\otimes^0_1V=V^*\) the space of all linear forms \(V\rightarrow K\).
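As a small numerical aside (my illustration, not in the original text): a simple \((1,1)\)-tensor \(v\otimes\varphi\in\otimes^1_1V\) acts on \(V\) as the rank-one linear map \(u\mapsto\varphi(u)\,v\), which in coordinates is the outer product matrix.

```python
import numpy as np

rng = np.random.default_rng(3)
v = rng.standard_normal(3)      # element of V
phi = rng.standard_normal(3)    # linear form in V*, as a coefficient row
u = rng.standard_normal(3)      # test vector

# v (x) phi corresponds to the rank-one matrix outer(v, phi): u -> phi(u)*v.
M = np.outer(v, phi)
assert np.allclose(M @ u, (phi @ u) * v)
```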

Let \(I\) be an index set and \((V_i)_{i\in I}\) a family of vector spaces over \(K\). Then \[\oplus_{i\in I} V_i:=\lbrace (v_i)_{i\in I}\in \prod_{i\in I} V_i \mid \text{almost all } v_i=0\rbrace\] is called the direct sum of the vector spaces \(V_i\) (see Wikipedia).

The sets \[T(V):=\oplus_{i\geq 0}\otimes^iV=K\oplus V\oplus(V\otimes V)\oplus\dots\]\[T(V,V^*):=\oplus_{i,j\geq 0}\otimes^i_jV\] together with the multiplication, for \(v=(v^i),w=(w^i)\in T(V)\) and \(a=(a^i_j),b=(b^i_j)\in T(V,V^*)\),\[vw:=\left(\sum_{i+j=n}v^i\otimes w^j\right)_{n\geq 0}\]\[ab:=\left(\sum_{i+j=m,\ k+l=n}a^i_j\otimes b^k_l\right)_{m,n\geq 0}\] form algebras. \(T(V)\) is called the tensor algebra and \(T(V,V^*)\) the mixed tensor algebra.
The neutral element is \(1=(1_K,0,0,\dots)\).
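The graded product in \(T(V)\) works like polynomial multiplication, with \(\otimes\) in place of the product of coefficients. A minimal sketch (my own representation, not from the post): an element with finitely many nonzero components is a dict mapping the degree \(i\) to a coordinate array of \(\otimes^iV\).

```python
import numpy as np

# Degree 0 holds a scalar (a length-1 array), degree i an array of
# length (dim V)^i; the product collects all terms with i + j = n.
def tensor_algebra_product(v, w):
    out = {}
    for i, vi in v.items():
        for j, wj in w.items():
            term = np.kron(vi, wj)            # (x) of homogeneous parts
            out[i + j] = out.get(i + j, 0) + term
    return out

one = {0: np.array([1.0])}                    # the unit (1_K, 0, 0, ...)
x = {0: np.array([2.0]), 1: np.array([1.0, 3.0])}

# 1 acts as the neutral element of the algebra.
prod = tensor_algebra_product(one, x)
assert all(np.allclose(prod[k], x[k]) for k in x)
```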