
2020-01-24, Friday

What is a tensor?

If you say a vector is a list of numbers, $v = (v^1, \cdots, v^n)$, indexed by $i = 1, \cdots, n$, then a (rank 2) tensor is like a square matrix, $T^{ij}$, where $i, j = 1, \cdots, n$.

But there is something more to it.

How do a vector's coefficients change as we change basis?

As we learned last time, a vector is a list of numbers $v^i$, together with a specification of a basis, $e_i$, and we should write $$v = \sum_{i=1}^n v^i e_i.$$

In this way, the change of basis formula looks easy. Say you have another basis, called $\tilde e_i$. We can express each of the new basis vectors $\tilde e_i$ in terms of the old basis, $$\tilde e_i = \sum_{j} A_{i}^{j} e_j,$$ and the old basis vectors in terms of the new basis, $$e_i = \sum_j B_i^j \tilde e_j.$$

Hence, we can write the vector $v$ as a linear combination of the new basis: $$v = \sum_{i=1}^n v^i e_i = \sum_{i=1}^n v^i \Big( \sum_j B_{i}^j \tilde e_j \Big) = \sum_j \Big( \sum_i v^i B_i^j \Big) \tilde e_j.$$ Thus, if we define $\tilde v^j = \sum_i v^i B_{i}^j$, we get $$v = \sum_j \tilde v^j \tilde e_j.$$
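As a quick numerical sanity check (a hypothetical example in $\R^2$, with a made-up change-of-basis matrix $A$), the rule $\tilde v^j = \sum_i v^i B_i^j$, with $B = A^{-1}$, can be verified with NumPy:

```python
import numpy as np

# Rows of A give the new basis in terms of the old:  e~_i = sum_j A[i,j] e_j
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)          # then e_i = sum_j B[i,j] e~_j

# Old coefficients v^i of some vector v
v = np.array([3.0, 5.0])

# New coefficients: v~^j = sum_i v^i B[i,j]
v_new = v @ B

# Check: the underlying vector is unchanged.
# Old basis rows form the identity; each row of A is a new basis vector e~_i.
old_vec = v @ np.eye(2)       # sum_i v^i e_i
new_vec = v_new @ A           # sum_j v~^j e~_j
assert np.allclose(old_vec, new_vec)
```

The coefficients change (here $v = (3,5)$ becomes $\tilde v = (-2, 7)$), but the vector itself does not.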

So, how do a tensor's coefficients change as we change basis?

In the same spirit, we should write a tensor by specifying the basis: $$T = \sum_{i,j} T^{ij} \, e_i \otimes e_j. \tag{$*$}$$

Wait a sec, what is $e_i \otimes e_j$ (the $\otimes$ is read as “tensor”)? I guess I need to tell you the following general story then.

The $\otimes$ construction

Let $V$ and $W$ be two vector spaces over $\R$; we will form another vector space, denoted $V \otimes W$. It contains finite sums of terms $v \otimes w$, where $v \in V$ and $w \in W$; a general element of $V \otimes W$ looks like $$v_1 \otimes w_1 + \cdots + v_N \otimes w_N$$ for some finite number $N$. And we impose the following equivalence relations: $$(c v) \otimes w = v \otimes (c w), \quad v \in V, \ w \in W, \ c \in \R,$$ $$(v_1 + v_2) \otimes (w_1+w_2) = v_1 \otimes w_1 + v_1 \otimes w_2 + v_2 \otimes w_1 + v_2 \otimes w_2,$$ where $v_1, v_2 \in V$, $w_1, w_2 \in W$ ($v_1, v_2$ are vectors, not coordinates). The second rule is called the distribution rule.

Lemma. If $V$ has a basis $\{e_1, \cdots, e_n\}$ and $W$ has a basis $\{d_1, \cdots, d_m\}$, then $$e_i \otimes d_j, \quad i = 1, \cdots, n, \; j = 1, \cdots, m$$ is a basis for $V \otimes W$.

Proof: We only verify the spanning property here; that is, we show that any element of $V \otimes W$ can be written as a linear combination of the $e_i \otimes d_j$. A general element of $V \otimes W$ looks like $$v_1 \otimes w_1 + \cdots + v_N \otimes w_N$$ for some finite number $N$. If we can decompose each term in this sum as a linear combination of the $e_i \otimes d_j$, then we are done. So it suffices to consider the decomposition of a single element $v \otimes w \in V \otimes W$.

Suppose $v = \sum_i a^i e_i$ and $w = \sum_j b^j d_j$; then $$v \otimes w = \Big( \sum_i a^i e_i \Big) \otimes \Big( \sum_j b^j d_j \Big) = \sum_{i=1}^n \sum_{j=1}^m a^i b^j \, e_i \otimes d_j,$$ where we used the distribution rule of the tensor product. Indeed, $v \otimes w$ can be written as a linear combination of the $e_i \otimes d_j$. That finishes the proof. $\square$
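Numerically, this decomposition is the Kronecker product: if we identify $e_i \otimes d_j$ with the standard basis of $\R^{nm}$, the coefficients of $v \otimes w$ are exactly the products $a^i b^j$. A small check (with made-up coefficient vectors):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])   # coefficients a^i of v in the basis e_i
b = np.array([4.0, 5.0])        # coefficients b^j of w in the basis d_j

vw = np.kron(a, b)              # coefficients of v (x) w in the basis e_i (x) d_j
coeffs = np.outer(a, b)         # the n-by-m array of products a^i b^j

# Flattening the coefficient array row by row matches the Kronecker product
assert np.allclose(vw, coeffs.flatten())
```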

So, how do a tensor's coefficients change as we change basis?

How to interpret Eq $(*)$ then?

$T$ is an element of $V \otimes V$, and the equation $$T = \sum_{i,j} T^{ij} \, e_i \otimes e_j$$ decomposes $T$ as a linear combination of the basis vectors $e_i \otimes e_j$ of $V \otimes V$, with coefficients $T^{ij}$.

If we change basis, say to the new basis $\tilde e_i$ as above, then we have a new basis for $V \otimes V$: $$\tilde e_i \otimes \tilde e_j = \Big( \sum_l A_i^l e_l \Big) \otimes \Big( \sum_k A_j^k e_k \Big) = \sum_{l,k} A_i^l A_j^k \, e_l \otimes e_k.$$ Similarly, $$e_i \otimes e_j = \sum_{k,l} B_i^l B_j^k \, \tilde e_l \otimes \tilde e_k.$$

In the new basis, we have $$T = \sum_{ijkl} T^{ij} B_i^l B_j^k \, \tilde e_l \otimes \tilde e_k = \sum_{kl} \tilde T^{lk} \, \tilde e_l \otimes \tilde e_k,$$ where $$\tilde T^{lk} = \sum_{ij} T^{ij} B_i^l B_j^k.$$
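The rank-2 transformation rule can also be checked numerically. Sketch (with a made-up invertible $A$, and the convention that entry `B[i, l]` stores $B_i^l$, so in matrix form $\tilde T = B^{\mathsf T} T B$):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])      # rows give e~_i in terms of e_j
B = np.linalg.inv(A)            # rows give e_i in terms of e~_j

rng = np.random.default_rng(0)
T = rng.standard_normal((2, 2)) # coefficients T^{ij} in the old basis

# T~^{lk} = sum_{ij} T^{ij} B_i^l B_j^k
T_new = np.einsum('ij,il,jk->lk', T, B, B)

# Same thing in matrix form
assert np.allclose(T_new, B.T @ T @ B)

# Transforming back with A recovers the original coefficients
T_back = np.einsum('lk,li,kj->ij', T_new, A, A)
assert np.allclose(T_back, T)
```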

Einstein notation

It means: if you see a pair of repeated indices, you sum over them; this is called “contracting the repeated indices”. For example, instead of writing $$v = \sum_{i=1}^n v^i e_i,$$ we can omit the summation sign and write $$v = v^i e_i.$$

It is often used in the physics literature, since it makes long computational formulas shorter.

My notation differs from Boas's book, because I use both upper and lower indices, and the Einstein contraction is only between one upper index and one lower index.
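NumPy's `einsum` implements exactly this convention: repeated indices in the subscript string are contracted. (NumPy does not distinguish upper from lower indices; that bookkeeping is up to us.) A short illustration with made-up data:

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
M = np.arange(9.0).reshape(3, 3)

# w^i = M^i_j v^j  -- the repeated index j is contracted
w = np.einsum('ij,j->i', M, v)
assert np.allclose(w, M @ v)

# a full contraction, in the style of v^i v_i
s = np.einsum('i,i->', v, v)
assert np.isclose(s, 14.0)
```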

Alternative definition of the tensor product

$V \otimes W$ is the space of bilinear functions on $V^* \times W^*$ (see the next lecture about dual spaces). Recall that a linear function on $V^*$ is just an element of $V$ itself; this definition is a generalization of that fact.
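Concretely, on elementary tensors this identification reads (a standard formula, stated here ahead of the dual-space lecture): $$(v \otimes w)(\xi, \eta) = \xi(v) \, \eta(w), \qquad \xi \in V^*, \ \eta \in W^*,$$ extended to general elements of $V \otimes W$ by linearity.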

math121b/01-24.txt · Last modified: 2020/01/27 14:54 (external edit)