Solution (thanks to an anonymous student who provided it).
I will update the homework after each lecture. It is due next Wednesday (since we have Labor Day on Monday).
1. Let $V \In \R^3$ be the set of points $\{(x_1, x_2, x_3) \mid x_1 + x_2 + x_3 = 0\}$. Find a basis of $V$, and write the vector $(2,-1,-1)$ in that basis.
2. Let $V$ be as above, and let $W = \R^2$. Let $V \to W$ be the map that forgets the coordinate $x_3$. Is this an isomorphism? What's the inverse?
3. Let $V$ be as above, and let $W$ be the line generated by the vector $(1,2,3)$. Let $f: V \to W$ be the orthogonal projection, sending $v$ to the closest point on $W$. Is this a linear map? How do you show it? What's the kernel? Let $g: W \to V$ be the orthogonal projection. Is it a linear map? What's the relationship between $f$ and $g$?
4. About quotient spaces. Let $V = \R^2$, and let $W$ be the linear subspace generated by the vector $(1,2)$ (i.e. the line passing through the origin and $(1,2)$). For $v \in V$, let $[v] = v + W \in V/W$ denote the equivalence class that $v$ belongs to, i.e., the (affine) line parallel to $W$ and passing through $v$. Draw some pictures to answer these questions.
5. Another important notion is the dual vector space. Given a vector space $V$, the dual vector space is $V^* = Hom(V, \R)$, the set of linear maps from $V$ to $\R$ (Hom is short for 'homomorphism', which means a linear map in the case of vector spaces). For example, if $V = \R^2$, the coordinate functions $x$ and $y$ belong to $V^*$, and we have $V^* = \{ax + by \mid a,b \in \R\}$. Here $x, y$ form a basis for $V^*$.
Let $V$ be the vector space of polynomials of degree at most 3. What's the dimension of $V$? What's the dimension of $V^*$? Can you find a basis for $V$? A basis for $V^*$?
1. Here is a claim: $1+2+3+4+\cdots = -1/12$. Show that this is wrong.
Fun fact: there is an interesting function, called the Riemann zeta function $\zeta(s)$, which for $s > 1$ can be written as $\zeta(s) = \sum_{n=1}^\infty 1/n^s$. In fact, $\zeta(s)$ extends to a meromorphic function of $s$, and $\zeta(-1) = -1/12$.
2. Does the following series converge? Explain why.
3. Let $(a_n)$ be a sequence with each $a_n = \pm 1$. Show that $\sum_{n=1}^\infty a_n / 2^n$ converges. (Hint: absolute convergence implies convergence.)
4. What is the radius of convergence? Is it true that $$ \frac{1}{1-x} = 1 + x + x^2 + \cdots $$ holds for all real numbers $x \neq 1$?
5. We know that the following series diverges: $$ 1 + 1/2 + 1/3 + 1/4 + \cdots. $$ Question: does the following alternating series converge? Why? $$ 1 - 1/2 + 1/3 - 1/4 + \cdots $$ (Optional) Fix any real number $a$. Show that by rearranging the order of the terms in the alternating series above, we can make the series converge to $a$.
6. Line integral: let $\gamma$ be the straight line from $(0,0)$ to $(1,1)$. Compute the line integral $$ \int_\gamma 2\, dx + 3\, dy. $$ What if we replace $\gamma$ by a curved path, still from $(0,0)$ to $(1,1)$: would the above result change? Why?
What is differentiation? It measures the ratio of how the output changes versus how the input changes. It is a linear map from the vector space of small changes of the input to the vector space of small changes of the output.
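Concretely, for $f: \R \to \R$, $$ f(x_0 + \epsilon) \approx f(x_0) + f'(x_0)\, \epsilon, $$ so the linear map sends the small input change $\epsilon$ to the small output change $f'(x_0)\, \epsilon$: it is just multiplication by the number $f'(x_0)$.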
There is also the chain rule, which says: if a quantity $x$ affects $y$, and $y$ affects $z$, then $x$ affects $z$. If $y = 2x$ and $z = 3y$, then $z = 6x$.
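In formulas, the chain rule reads $$ \frac{dz}{dx} = \frac{dz}{dy} \cdot \frac{dy}{dx}, $$ which in the example above gives $\frac{dz}{dx} = 3 \cdot 2 = 6$.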
If you have a function $f(x,y)$ that depends on two input variables, you can ask how sensitive the output is to each of them, say $$ \frac{\d f}{\d x}(x_0, y_0) = \lim_{\epsilon \to 0} \frac{f(x_0 + \epsilon, y_0) - f(x_0, y_0)}{\epsilon}. $$
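For example, if $f(x, y) = x^2 y$, then treating the other variable as a constant gives $$ \frac{\d f}{\d x} = 2xy, \qquad \frac{\d f}{\d y} = x^2. $$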
What is integration? It is a process of collecting contributions along the way. For example, the integral (when $f(x)$ is continuous) is the limit of the following approximations: $$ \int_a^b f(x)\, dx = \lim_{N\to \infty} \sum_{i=1}^N f(x_{N,i})\, \Delta_N x, \quad x_{N,i} = a + \frac{b-a}{N} i, \quad \Delta_N x = \frac{b-a}{N}. $$
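For example, take $f(x) = x$ on $[0,1]$, so $x_{N,i} = i/N$ and $\Delta_N x = 1/N$: $$ \sum_{i=1}^N \frac{i}{N} \cdot \frac{1}{N} = \frac{N(N+1)}{2N^2} \to \frac{1}{2}, $$ which is indeed $\int_0^1 x\, dx$.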
The fundamental theorem of calculus says $$ \int_a^b f'(x) dx = f(b) - f(a).$$
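For example, with $f(x) = x^2/2$ this recovers the computation above: $$ \int_0^1 x\, dx = \frac{1^2}{2} - \frac{0^2}{2} = \frac{1}{2}. $$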
Today we will go over sequences of numbers and limits.
Let $a_1, a_2, \cdots $ be a sequence of numbers. There are many examples of these.
We say a sequence $(a_n)$ converges to $a$, if for any $\epsilon>0$, there exists $N>0$, such that for any $n > N$, we have $|a_n - a| \leq \epsilon$.
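For example, $a_n = 1/n$ converges to $0$: given $\epsilon > 0$, take $N = 1/\epsilon$; then for any $n > N$ we have $|a_n - 0| = 1/n < 1/N = \epsilon$.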
A series is something that looks like $\sum_{n=1}^\infty a_n$. We can define the partial sums $S_n = \sum_{j=1}^n a_j$. We say the series $\sum_n a_n$ converges if and only if the sequence of partial sums converges.
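For example, the geometric series $\sum_{n=1}^\infty 1/2^n$ has partial sums $S_n = 1 - 2^{-n}$, which converge to $1$, so the series converges and its sum is $1$.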
A series is like a discretized version of an integral.
1. What does absolute convergence mean for a series?
2. The model convergent series.
3. Various convergence tests.
Some exercises from Boas's textbook: try 1, 3, 4-8.
Today we will continue our review of linear algebra. Hopefully you have brushed up on set notation over the weekend.
Let $V$ be a vector space, and $W \In V$ be a subspace. The quotient space $V/W$ is the following vector space: its elements are the equivalence classes $[v] = v + W$ for $v \in V$, with addition $[v_1] + [v_2] = [v_1 + v_2]$ and scalar multiplication $c \cdot [v] = [cv]$.
Motivation: why do we care about quotient spaces? What is the meaning of the subspace? When we quotient something out, we are imposing an equivalence relation: we are choosing to ignore some differences. In the quotient vector space case, suppose we want to identify the vectors in the subspace $W$ with $0$; then we say two points $v_1, v_2$ are equivalent if their difference is in $W$.
A basis of $V$ is a collection of vectors that is maximally linearly independent (equivalently, linearly independent and spanning $V$).
Given a basis, we can express every other vector as a linear combination of the basis vectors. The coefficients in the linear combination are called coordinates.
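For example, in $\R^2$ with the basis $v_1 = (1,1)$, $v_2 = (1,-1)$, the vector $(3,1)$ has coordinates $(2,1)$, since $(3,1) = 2 v_1 + 1 \cdot v_2$.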
An inner product on a vector space is a function $(,): V \times V \to \R$ that is symmetric, bilinear, and positive definite (i.e. $(v,v) > 0$ for all $v \neq 0$).
You may be very familiar with $\R^n$ equipped with the (default) Euclidean inner product. But in general, for a vector space $V$, an inner product is extra data that you equip it with afterwards.
A nice basis for a vector space with an inner product is an orthonormal basis: $e_1, \cdots, e_n \in V$ such that $(e_i, e_j) = \delta_{ij}$.
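Orthonormal bases make coordinates easy: for any $v \in V$ we have $$ v = \sum_{i=1}^n (v, e_i)\, e_i, $$ so the $i$-th coordinate of $v$ is simply the inner product $(v, e_i)$.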
If $V$ is a vector space with an inner product, and $W \In V$ is a subspace, then we can define the orthogonal projection $$\pi: V \to W, $$ characterized by the condition that $v - \pi(v) \perp w$ for all $w \in W$.
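For example, when $W$ is the line generated by a nonzero vector $w$, the projection has the explicit formula $$ \pi(v) = \frac{(v, w)}{(w, w)}\, w, $$ and one can check directly that $(v - \pi(v), w) = 0$.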
Let $V \In \R^3$ be the set of points $\{(x_1, x_2, x_3) \mid x_1 + x_2 + x_3 = 0\}$. Find a basis of $V$, and write the vector $(2,-1,-1)$ in that basis.
Let $W = \R^2$, and let $V \to W$ be the map that forgets the coordinate $x_3$. Is this an isomorphism? What's the inverse?
Let $V$ be as above, and $W$ be the line generated by the vector $(1,2,3)$. Let $f: V \to W$ be the orthogonal projection, sending $v$ to the closest point on $W$. Is this a linear map? How do you show it? What's the kernel? Let $g: W \to V$ be the orthogonal projection. Is it a linear map? What's the relationship between $f$ and $g$?
Let's start from scratch again. What is linear algebra?
This is a textbook on linear algebra by Prof. Givental.
Row vectors, column vectors, matrices. Let's also review the index notation $a_i = \sum_{j} M_{ij} b_j$.
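For example, in the $2 \times 2$ case the index notation unpacks to $$ \begin{pmatrix} a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} M_{11} & M_{12} \\ M_{21} & M_{22} \end{pmatrix} \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}, \qquad a_1 = M_{11} b_1 + M_{12} b_2, \quad a_2 = M_{21} b_1 + M_{22} b_2. $$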
Very concrete, very computable.
Geometrical, as we went over in class.
A 2-dimensional vector is something you can draw.
A 3-dim vector, hmm, harder.
How about a 4-dim vector? An $\infty$-dim one? The dimension doesn't matter: the rules we obtain from the 2- and 3-dimensional cases are good enough.
The goofy math prof: a vector space is a set $V$ together with two operations, addition $+: V \times V \to V$ and scalar multiplication $\cdot: \R \times V \to V$, such that some obvious conditions are satisfied (commutativity, associativity, distributivity, and so on).
Why do we care about this? Because it is somehow useful.
For example,
How do vector spaces talk to each other? Linear maps.
Do an example of stretching, skewing.
Do a non-example of bending a line in $\R^2$.
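For instance, on $\R^2$, stretching and skewing (shearing) are given by matrices such as $$ \begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix} \quad \text{(stretch by 2 in the $x$-direction)}, \qquad \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \quad \text{(shear)}. $$ By contrast, a map that bends a line, say $f(x) = (x, |x|)$, fails linearity: $f(-1) = (-1, 1) \neq -f(1) = (-1, -1)$.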
Kernel, image and cokernel
We didn't quite cover the idea of a quotient space. I will do that next time.