2020-01-24, Friday
What is a tensor?
If you say a vector is a list of numbers, $v = (v^1, \cdots, v^n)$, indexed by $i = 1, \cdots, n$,
then a (rank 2) tensor is like a square matrix, $T^{ij}$, where $i, j = 1, \cdots, n$.
But there is something more to it.
How do a vector's coefficients change as we change basis?
As we learned last time, a vector is a list of numbers $v^i$, together with a specification of a basis, $e_i$, and we should write
$$v = \sum_{i=1}^n v^i e_i.$$
In this way, the change of basis formula looks easy. Say you have another basis, called $\tilde e_i$. We can express each of the new basis vectors $\tilde e_i$ in terms of the old basis
$$\tilde e_i = \sum_j A_i^j e_j$$
and the old basis vectors in terms of the new basis
$$e_i = \sum_j B_i^j \tilde e_j.$$
(Substituting one expansion into the other shows that the matrices $A$ and $B$ are inverses of each other.)
Hence, we can write the vector $v$ as a linear combination of the new basis
$$v = \sum_{i=1}^n v^i e_i = \sum_{i=1}^n v^i \Big( \sum_j B_i^j \tilde e_j \Big) = \sum_j \Big( \sum_i v^i B_i^j \Big) \tilde e_j$$
Thus, if we define $\tilde v^j = \sum_i v^i B_i^j$, we get
$$v = \sum_j \tilde v^j \tilde e_j.$$
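To see the formula in action, here is a minimal NumPy sketch (my own illustration, not part of the lecture): the matrix $A$ below is an arbitrary invertible choice of new basis for $\mathbb{R}^2$, $B$ is its inverse, and the check confirms that the components $\tilde v^j = \sum_i v^i B_i^j$ describe the same vector.

```python
import numpy as np

# Rows of A are the new basis vectors e~_i written in the old basis,
# i.e. e~_i = sum_j A[i, j] e_j.  The matrix B with e_i = sum_j B[i, j] e~_j
# is then the inverse of A.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])       # an arbitrary invertible choice
B = np.linalg.inv(A)

E_old = np.eye(2)                # rows are e_1, e_2 (the standard basis)
E_new = A @ E_old                # rows are e~_1, e~_2

v_old = np.array([3.0, 5.0])     # components v^i in the old basis
v_new = v_old @ B                # v~^j = sum_i v^i B[i, j]

# Both expansions describe the same geometric vector:
assert np.allclose(v_old @ E_old, v_new @ E_new)
```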
So, how do a tensor's coefficients change as we change basis?
In the same spirit, we should write a tensor by specifying the basis
$$T = \sum_{i,j} T^{ij} \, e_i \otimes e_j \qquad (*)$$
Wait a sec, what is $e_i \otimes e_j$ (the $\otimes$ is read as 'tensor')? I guess I need to tell you the following general story then.
The ⊗ construction
Let $V$ and $W$ be two vector spaces over $\mathbb{R}$; we will form another vector space denoted $V \otimes W$. It consists of finite sums of terms of the form $v \otimes w$, where $v \in V$ and $w \in W$. A general element of $V \otimes W$ looks like
$$v_1 \otimes w_1 + \cdots + v_N \otimes w_N$$
for some finite number $N$. And we impose the following equivalence relations:
$$(cv) \otimes w = v \otimes (cw), \quad v \in V,\ w \in W,\ c \in \mathbb{R},$$
$$(v_1 + v_2) \otimes (w_1 + w_2) = v_1 \otimes w_1 + v_1 \otimes w_2 + v_2 \otimes w_1 + v_2 \otimes w_2,$$
where $v_1, v_2 \in V$ and $w_1, w_2 \in W$ ($v_1, v_2$ are vectors, not coordinates). The second rule is called the distribution rule.
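As a quick sanity check (again my own, not from the lecture): once we fix bases and represent $v \otimes w$ by the outer product of the coefficient vectors, which the lemma below justifies, the two rules become familiar identities for outer products. All the vectors here are made-up examples.

```python
import numpy as np

# Coordinate check of the two equivalence relations, representing v (x) w
# by the outer product of coefficient vectors (see the lemma below).
v,  w  = np.array([1.0, 2.0]), np.array([3.0, -1.0])
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 2.0])
w1, w2 = np.array([1.0, 1.0]), np.array([2.0, 0.0])
c = 5.0

# (c v) (x) w = v (x) (c w)
assert np.allclose(np.outer(c * v, w), np.outer(v, c * w))

# (v1 + v2) (x) (w1 + w2) = v1(x)w1 + v1(x)w2 + v2(x)w1 + v2(x)w2
lhs = np.outer(v1 + v2, w1 + w2)
rhs = (np.outer(v1, w1) + np.outer(v1, w2) +
       np.outer(v2, w1) + np.outer(v2, w2))
assert np.allclose(lhs, rhs)
```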
Lemma
If $V$ has a basis $\{e_1, \cdots, e_n\}$ and $W$ has a basis $\{d_1, \cdots, d_m\}$, then
$$e_i \otimes d_j, \quad i = 1, \cdots, n, \quad j = 1, \cdots, m$$
form a basis for $V \otimes W$.
Proof: It suffices to show that any element of $V \otimes W$ can be written as a linear combination of the $e_i \otimes d_j$. (Strictly speaking, one should also check linear independence; we only verify the spanning part here.) A general element of $V \otimes W$ looks like
$$v_1 \otimes w_1 + \cdots + v_N \otimes w_N$$
for some finite number $N$. If we can decompose each term in the above sum as a linear combination of the $e_i \otimes d_j$, then we are done. So it suffices to consider the decomposition of a single term $v \otimes w \in V \otimes W$.
Suppose $v = \sum_i a^i e_i$ and $w = \sum_j b^j d_j$; then
$$v \otimes w = \Big( \sum_i a^i e_i \Big) \otimes \Big( \sum_j b^j d_j \Big) = \sum_{i=1}^n \sum_{j=1}^m a^i b^j \, e_i \otimes d_j,$$
where we used the distribution rule of the tensor product. Indeed, $v \otimes w$ can be written as a linear combination of the $e_i \otimes d_j$. That finishes the proof. □
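In coordinates, the computation in the proof says that the coefficient array of $v \otimes w$ in the basis $e_i \otimes d_j$ is the outer product of the coefficient vectors. A small NumPy sketch with arbitrary numbers, taking $n = 3$ and $m = 2$:

```python
import numpy as np

# Coefficients of v and w in the bases {e_i} and {d_j} (arbitrary numbers).
a = np.array([1.0, 2.0, 0.5])   # v = sum_i a^i e_i
b = np.array([4.0, -1.0])       # w = sum_j b^j d_j

# By the proof, v (x) w = sum_{i,j} a^i b^j  e_i (x) d_j, so its coefficient
# array in the basis {e_i (x) d_j} is the outer product of a and b.
coeff = np.outer(a, b)          # coeff[i, j] = a^i * b^j, shape (3, 2)
print(coeff)
```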
So, how do a tensor's coefficients change as we change basis?
How to interpret Eq. (*) then?
$T$ is an element of $V \otimes V$, and this equation
$$T = \sum_{i,j} T^{ij} \, e_i \otimes e_j$$
decomposes $T$ as a linear combination of the basis vectors $e_i \otimes e_j$ of $V \otimes V$, with coefficients $T^{ij}$.
If we change basis, say using the new basis $\tilde e_i$ as above, then we have a new basis for $V \otimes V$
$$\tilde e_i \otimes \tilde e_j = \Big( \sum_l A_i^l e_l \Big) \otimes \Big( \sum_k A_j^k e_k \Big) = \sum_{l,k} A_i^l A_j^k \, e_l \otimes e_k$$
Similarly
$$e_i \otimes e_j = \sum_{k,l} B_i^l B_j^k \, \tilde e_l \otimes \tilde e_k$$
In the new basis, we have
$$T = \sum_{i,j,k,l} T^{ij} B_i^l B_j^k \, \tilde e_l \otimes \tilde e_k = \sum_{k,l} \tilde T^{lk} \, \tilde e_l \otimes \tilde e_k$$
where
$$\tilde T^{lk} = \sum_{i,j} T^{ij} B_i^l B_j^k.$$
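Here is a small numerical check of this transformation rule (again just an illustration), reusing the arbitrary basis matrices $A$ and $B$ from the vector sketch above; np.einsum spells out exactly the sum $\tilde T^{lk} = \sum_{i,j} T^{ij} B_i^l B_j^k$, and for a rank 2 tensor this agrees with the matrix formula $B^T T B$.

```python
import numpy as np

# The same arbitrary change of basis as in the vector sketch above.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)

T_old = np.array([[1.0, 2.0],
                  [3.0, 4.0]])   # arbitrary components T^{ij}

# T~^{lk} = sum_{i,j} T^{ij} B[i, l] B[j, k]
T_new = np.einsum('ij,il,jk->lk', T_old, B, B)

# For a rank 2 tensor this is just the matrix identity B^T T B.
assert np.allclose(T_new, B.T @ T_old @ B)
```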
Einstein notation
It means: if you see a pair of repeated indices, you sum over them; this is called "contracting" the repeated indices. For example, instead of writing
$$v = \sum_{i=1}^n v^i e_i$$
we can omit the summation sign and write
$$v = v^i e_i.$$
It is often used in the physics literature, since it makes long computational formulas shorter.
My notation differs from Boas's book, because I use both upper and lower indices, and the Einstein contraction is only between one upper index and one lower index.
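Incidentally, NumPy's np.einsum uses essentially this convention: repeated letters in the subscript string are summed over. A tiny illustration, computing $\tilde v^j = v^i B_i^j$ with the $B$ from the earlier change-of-basis sketch:

```python
import numpy as np

# Repeated letters in np.einsum's subscript string are summed over,
# mirroring the Einstein convention.
v = np.array([3.0, 5.0])
B = np.array([[ 1.0, -1.0],
              [-1.0,  2.0]])     # the inverse of the A used earlier

v_new = np.einsum('i,ij->j', v, B)   # v~^j = v^i B_i^j, sum over i
assert np.allclose(v_new, v @ B)
```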
Alternative definition of the tensor product
$V \otimes W$ is the space of bilinear functions on $V^* \times W^*$ (see the next lecture about dual spaces). Recall that the space of linear functions on $V^*$ is just $V$ itself; this is a generalization of that fact.
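A minimal sketch of this viewpoint (my own illustration, assuming $V = W = \mathbb{R}^2$ and arbitrary components $T^{ij}$): the array $T$ eats a pair of covectors, given by their components $\alpha_i$ and $\beta_j$, and returns the number $\sum_{i,j} T^{ij} \alpha_i \beta_j$.

```python
import numpy as np

# An element T of V (x) W, given by its components T^{ij}, acts as a
# bilinear function on a pair of covectors (alpha, beta).
T = np.array([[1.0, 2.0],
              [3.0, 4.0]])        # arbitrary components T^{ij}

def T_as_bilinear(alpha, beta):
    # T(alpha, beta) = sum_{i,j} T^{ij} alpha_i beta_j
    return np.einsum('ij,i,j->', T, alpha, beta)

alpha = np.array([1.0, 0.0])      # covector components alpha_i
beta  = np.array([0.0, 1.0])      # covector components beta_j
print(T_as_bilinear(alpha, beta)) # picks out the component T^{12} = 2.0
```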