August 28: Review of Linear Algebra 2

Today we will continue our review of linear algebra. Hopefully you have brushed up on set notation over the weekend.

quotient space

Let $V$ be a vector space, and let $W \subset V$ be a subspace. The quotient space $V/W$ is the following vector space (see the example after the list):

  • as a set, it consists of elements of the form $[v] = v + W = \{v + w \mid w \in W\}$, i.e. shifted copies of the subspace $W$. We note that if $v_1, v_2 \in V$ satisfy $v_1 - v_2 \in W$, then $[v_1] = [v_2]$.
  • the vector space operations are defined using the representatives, namely
    • $c[v] = [cv], \ \forall c \in \R, v \in V$
    • $[v_1] + [v_2] = [v_1 + v_2]$
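For a quick illustration, take $V = \R^2$ and let $W$ be the $x$-axis, $W = \{(x, 0) \mid x \in \R\}$. Then

$$ [(x,y)] = (x,y) + W = \{(x', y) \mid x' \in \R\}, $$

so the class $[(x,y)]$ only remembers the second coordinate $y$, and the map $[(x,y)] \mapsto y$ identifies $V/W$ with $\R$.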

Motivation: why do we care about the quotient space? What is the meaning of quotienting by a subspace? When we quotient something out, we are defining an equivalence relation and ignoring certain differences. In the quotient vector space case, suppose we want to identify all vectors in the subspace $W$ with $0$; then we declare two points $v_1, v_2$ to be equivalent if their difference lies in $W$.

basis

A basis of $V$ is a collection of vectors that is maximally linearly independent.

Given a basis, we can express every other vector as a linear combination of the basis vectors. The coefficients in the linear combination are called coordinates.
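For instance, in $\R^2$ the vectors $v_1 = (1,1)$ and $v_2 = (1,-1)$ form a basis, and the vector $(3,1)$ has coordinates $(2,1)$ in this basis, since

$$ (3,1) = 2\,(1,1) + 1\,(1,-1). $$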

inner product

An inner product on a vector space is a function $(\cdot,\cdot): V \times V \to \R$, such that

  • symmetric: $(v,w) = (w,v)$
  • linear in each slot: $(a v_1 + b v_2, w) = a (v_1, w) + b (v_2, w)$ (by symmetry, also linear in the 2nd slot)
  • positive definite: $(v,v) \geq 0$, and $(v,v) = 0$ only if $v = 0$.

You may be very familiar with $\R^n$, equipped with a (default) Euclidean inner product. But in general, for a vector space $V$, the inner product is extra structure that you give it afterwards.
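For example, on $\R^2$ one could instead use the weighted pairing

$$ (v, w) := 2 v_1 w_1 + 3 v_2 w_2, $$

which is symmetric, linear in each slot, and positive definite (since $(v,v) = 2 v_1^2 + 3 v_2^2 > 0$ unless $v = 0$), hence a perfectly good inner product, just not the Euclidean one.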

A nice basis for a vector space with an inner product is called an orthonormal basis: vectors $e_1, \cdots, e_n \in V$ such that $(e_i, e_j) = \delta_{ij}$.

Orthogonal projection

If $V$ is a vector space with an inner product, and $W \subset V$ is a subspace, then we can define the orthogonal projection $\pi: V \to W$; it satisfies $v - \pi(v) \perp w$ for every $w \in W$ (in particular, $v - \pi(v) \perp \pi(v)$).
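Concretely, if $e_1, \cdots, e_k$ is an orthonormal basis of $W$, then the orthogonal projection can be written as

$$ \pi(v) = \sum_{i=1}^{k} (v, e_i)\, e_i, $$

and if $W$ is the line spanned by a single nonzero vector $u$, this becomes $\pi(v) = \frac{(v,u)}{(u,u)}\, u$. One can check directly that $v - \pi(v)$ is then perpendicular to every vector in $W$.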

Exercise time

Let $V \subset \R^3$ be the set of points $\{(x_1, x_2, x_3) \mid x_1 + x_2 + x_3 = 0\}$. Find a basis of $V$, and write the vector $(2,-1,-1)$ in that basis.

Let $W = \R^2$, and let $V \to W$ be the map that forgets the coordinate $x_3$. Is this an isomorphism? What's the inverse?

Let $V$ be as above, and let $W$ be the line generated by the vector $(1,2,3)$. Let $f: V \to W$ be the orthogonal projection, sending $v$ to the closest point on $W$. Is this a linear map? How do you show it? What's the kernel? Let $g: W \to V$ be the orthogonal projection. Is it a linear map? What's the relation between $f$ and $g$?
