1. Review
Ross Ch.1 Introduction
- We defined the set of natural numbers $\mathbb{N}$ and explored its successor property: if $n \in \mathbb{N}$, then $n+1 \in \mathbb{N}$ as well.
- We defined induction, a method for proving an infinite family of successive propositions. An induction proof begins with a base case. Once the proposition is verified for the base case, one assumes it holds for some $n$ and then proves that it must hold for $n+1$.
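Example (a quick illustration of the template): to prove $1 + 2 + \cdots + n = \frac{n(n+1)}{2}$ for all $n \in \mathbb{N}$, check the base case $n = 1$ (indeed $1 = \frac{1 \cdot 2}{2}$), assume the formula holds for some $n$, and then $1 + 2 + \cdots + n + (n+1) = \frac{n(n+1)}{2} + (n+1) = \frac{(n+1)(n+2)}{2}$, which is the formula for $n+1$.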
- We also defined the set of rational numbers $\mathbb{Q}$ and the set of integers $\mathbb{Z}$, both of which are ordered, meaning there is a notion of greater than and less than between any two of their elements.
- The Rational Zeros Theorem states that the only rational candidates for solutions of a polynomial equation with integer coefficients
$c_nx^n+c_{n-1}x^{n-1}+\cdots+c_1x+c_0=0$
have the form $r=\frac{c}{d}$ (in lowest terms) where $c$ divides $c_0$ and $d$ divides $c_n$.
Example: $\sqrt{2}$ is not a rational number
The only rational candidates for solutions of $x^2-2=0$ are $\pm1$ and $\pm2$, and none of these satisfies the equation. Since $\sqrt{2}$ is a solution of $x^2-2=0$, it cannot be a rational number.
The Rational Zeros Theorem thus provides a convenient way of identifying the potential rational roots of complicated polynomials with integer coefficients.
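As an illustration (my own sketch, not from Ross; the function names are made up), here is a short Python snippet that enumerates the candidates $\pm\frac{c}{d}$ and tests which, if any, are actual roots:

```python
from fractions import Fraction

def rational_root_candidates(coeffs):
    """Candidates c/d from the Rational Zeros Theorem.
    coeffs = [c_n, ..., c_1, c_0] with integer entries and c_n, c_0 nonzero."""
    def divisors(m):
        m = abs(m)
        return [k for k in range(1, m + 1) if m % k == 0]
    cands = set()
    for c in divisors(coeffs[-1]):        # c must divide c_0
        for d in divisors(coeffs[0]):     # d must divide c_n
            cands.update({Fraction(c, d), Fraction(-c, d)})
    return sorted(cands)

def rational_roots(coeffs):
    """Keep only the candidates that actually satisfy the polynomial."""
    def value(x):
        acc = Fraction(0)
        for c in coeffs:
            acc = acc * x + c             # Horner's rule
        return acc
    return [r for r in rational_root_candidates(coeffs) if value(r) == 0]

# x^2 - 2 = 0: the candidates are +-1 and +-2, none of which is a root,
# so sqrt(2) (which is a root) cannot be rational.
print(rational_root_candidates([1, 0, -2]))
print(rational_roots([1, 0, -2]))   # []
```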
- Next we discussed the set of Real Numbers $\mathbb{R}$ which can be represented as the real number line.
- $\mathbb{R}$ is an ordered field. It is the mathematical system on which we will do our analysis.
- maximum: Let $S\subset\mathbb{R}$ and $\alpha \in S$. Then $\alpha$ is a maximum of $S$ if $\alpha \geq \beta$ $\forall \beta \in S$.
- minimum: Let $S\subset\mathbb{R}$ and $\alpha \in S$. Then $\alpha$ is a minimum of $S$ if $\alpha \leq \beta$ $\forall \beta \in S$.
- $\{r \in \mathbb{Q} : 0 \leq r \leq \sqrt{2}\}$ has a minimum of 0 but no maximum: $\sqrt{2}$ does not belong to the set, yet the set contains rationals arbitrarily close to $\sqrt{2}$.
- supremum: If $S$ is bounded above and has a least upper bound, we call it the supremum of $S$, written $\sup S$.
- infimum: If $S$ is bounded below and has a greatest lower bound, we call it the infimum of $S$, written $\inf S$.
Example: $\inf\{n^{(-1)^n} : n \in \mathbb{N}\} = 0$
The Completeness Axiom: Every nonempty subset $S$ of $\mathbb{R}$ that is bounded above has a least upper bound, i.e. $\sup S$ exists. Likewise, every nonempty subset that is bounded below has a greatest lower bound.
- The Completeness Axiom does not hold for $\mathbb{Q}$: for example, $\{r \in \mathbb{Q} : r^2 < 2\}$ is bounded above but has no least upper bound in $\mathbb{Q}$.
- Archimedean Property: if $a > 0$ and $b > 0$, then $\exists n \in \mathbb{Z}^+$ such that $na > b$
- Denseness of $\mathbb{Q}$: If $a, b \in \mathbb{R}$ and $a < b$, then $\exists r \in \mathbb{Q}$ such that $a < r < b$
- $\infty$ and $-\infty$ are not real numbers but are very useful to include. $[a, \infty) = \{x \in \mathbb{R} : a \leq x \}$, $(a, \infty) = \{x \in \mathbb{R} : a < x \}$, $(-\infty, b] = \{ x \in \mathbb{R} : x \leq b \}$, $(-\infty, b) = \{x \in \mathbb{R} : x < b \}$
Ross Ch. 2 Sequences
- definition of a convergent sequence: $(s_n)$ converges to $s$ if ${\forall \epsilon > 0, \exists N}$ such that ${n > N \Rightarrow |s_n - s| < \epsilon}$.
example: show that $\lim_{n \rightarrow \infty}{\frac{1}{n^2}} = 0$.
Let $\epsilon > 0$, ${N = \frac{1}{\sqrt{\epsilon}}}$. Then ${n > N \Rightarrow n > \frac{1}{\sqrt{\epsilon}} \Rightarrow n^2 > \frac{1}{\epsilon}}$ and hence ${\epsilon > \frac{1}{n^2}}$. Thus $n > N \Rightarrow |\frac{1}{n^2} - 0| < \epsilon$.
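A quick numeric sanity check of this $\epsilon$-$N$ argument (my own sketch, not part of the proof):

```python
import math

def N_for(eps):
    """The N from the argument above: n > N forces 1/n^2 < eps."""
    return 1 / math.sqrt(eps)

# Spot-check a few tolerances: every sampled n beyond N stays within eps of 0.
for eps in [0.1, 0.01, 1e-4]:
    n0 = math.floor(N_for(eps)) + 1          # first integer larger than N
    ok = all(1 / n**2 < eps for n in range(n0, n0 + 1000))
    print(f"eps={eps:g}  N={N_for(eps):.2f}  next 1000 terms within eps: {ok}")
```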
example: Let $\lim_{n \rightarrow \infty}{s_n} = s \neq 0$. Prove $\inf\{|s_n| : n \in \mathbb{N}\} > 0$.
Let $\epsilon = \frac{1}{2}|s| > 0$. Then ${\exists N \in \mathbb{N}}$ so that ${n > N \Rightarrow |s_n - s| < \frac{|s|}{2}}$. For $n > N$ we have $|s_n| \geq \frac{|s|}{2}$; otherwise the triangle inequality would yield $|s| < |s|$. If we set $m = \min\{\frac{|s|}{2}, |s_1|, \ldots, |s_N|\}$ then ${m > 0}$ and $|s_n| \geq m$ $\forall n \in \mathbb{N}$, thus $\inf\{|s_n| : n \in \mathbb{N}\} \geq m > 0$.
- 9.1 Theorem: Convergent sequences are bounded
- Additivity and homogeneity are properties of convergence, i.e. if $\lim{s_n} = s$ and $k \in \mathbb{R}$ then $\lim{ks_n} = ks$, and if $\lim{s_n} = s$, $\lim{t_n} = t$ then $\lim{s_n + t_n} = s + t$
- If $\lim{s_n} = s \neq 0$ and $s_n \neq 0$ for all $n$, then $\lim{\frac{1}{s_n}} = \frac{1}{s}$; combined with the product rule it follows that $\lim{\frac{t_n}{s_n}} = \frac{t}{s}$
- If $s_n > 0$ for all $n$, then $\lim{s_n} = \infty \Leftrightarrow \lim{\frac{1}{s_n}} = 0$
- 10.2 Theorem: All bounded monotone sequences converge
example: Let ${s_1 = 5}$ and ${s_n = \frac{s_{n-1}^2 + 5}{2s_{n-1}}}$ for ${n \geq 2}$. Prove $(s_n)$ is monotone and bounded.
soln:
First, ${s_n > \sqrt{5}}$ for all $n$: $s_1 = 5 > \sqrt{5}$, and if $s_n > \sqrt{5}$ for some $n$, then ${s_{n+1} = \frac{s_n^2 + 5}{2s_n} > \sqrt{5} \Leftrightarrow s_n^2 - 2\sqrt{5}\,s_n + 5 > 0 \Leftrightarrow (s_n - \sqrt{5})^2 > 0}$, which holds because $s_n \neq \sqrt{5}$. By induction the sequence is bounded below by $\sqrt{5}$ (and in particular positive).
Next, the sequence is decreasing: ${s_n - s_{n+1} = s_n - \frac{s_n^2 + 5}{2s_n} = \frac{s_n^2 - 5}{2s_n} > 0}$ since $s_n^2 > 5$ and $s_n > 0$. Thus ${s_{n+1} \leq s_n \ \forall n \in \mathbb{N}}$, so $(s_n)$ is monotone and bounded (below by $\sqrt{5}$, above by $s_1 = 5$) and therefore converges.
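As a numeric sanity check (my own illustration, not part of the solution): this recursion is exactly Newton's method for $x^2 = 5$, so the terms should decrease toward $\sqrt{5} \approx 2.2360679$:

```python
import math

# Iterate s_n = (s_{n-1}^2 + 5) / (2 s_{n-1}) starting from s_1 = 5 and watch
# the terms decrease toward sqrt(5), staying above it the whole time.
s = 5.0
for n in range(1, 8):
    print(f"s_{n} = {s:.10f}   (s_n - sqrt(5) = {s - math.sqrt(5):.2e})")
    s = (s * s + 5) / (2 * s)
```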
definition: $\limsup s_n = \lim_{N \rightarrow \infty} \sup\{s_n : n > N\}$
definition: $\liminf s_n = \lim_{N \rightarrow \infty} \inf\{s_n : n > N\}$
It follows that $\limsup s_n \leq \sup\{s_n : n \in \mathbb{N}\}$ and $\liminf s_n \geq \inf\{s_n : n \in \mathbb{N}\}$
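A small numeric illustration of the tail sups and infs (my own sketch), using $s_n = (-1)^n\left(1 + \frac{1}{n}\right)$, for which $\limsup s_n = 1$ and $\liminf s_n = -1$:

```python
# Tail sups/infs of s_n = (-1)^n (1 + 1/n) over a finite sample of the sequence:
# as N grows they approach lim sup = 1 and lim inf = -1.
def s(n):
    return (-1) ** n * (1 + 1 / n)

terms = [s(n) for n in range(1, 10001)]
for N in [1, 10, 100, 1000]:
    tail = terms[N:]          # the sampled terms s_n with n > N
    print(f"N={N:5d}  sup of tail = {max(tail):+.6f}  inf of tail = {min(tail):+.6f}")
```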
definition: Cauchy sequence: $(s_n)$ is Cauchy if for each ${\epsilon > 0}$ ${\exists N}$ such that ${m,n > N \Rightarrow |s_n-s_m| < \epsilon}$
- 10.11 Theorem: Convergent $\Leftrightarrow$ Cauchy
Subsequences
-11.2 Theorem: Let $(s_n)$ be a sequence
(i) If $t$ is in $\mathbb{R}$, then there is a subsequence of $(s_n)$ converging to $t$ iff $\{n \in \mathbb{N} : |s_n - t| < \epsilon \}$ is infinite for all $\epsilon > 0$
-11.3 Theorem
If $(s_n)$ converges, then every subsequence converges to the same limit.
-11.4 Theorem
Every sequence $(s_n)$ has a monotonic subsequence
-11.5 Bolzano-Weierstrass Theorem
Every bounded sequence has a convergent subsequence.
Rudin ch. 2 Basic Topology
First we give the set-theoretic definitions of function, image, inverse image, onto, and one-to-one. I offer the definition of one-to-one below:
Let A and B be two sets and let $f$ be a mapping of A into B. If, $\forall y \in B$, $f^{-1}(y)$ consists of at most one element of A, then $f$ is a one-to-one mapping of A into B.
Then we introduce notions of size such as finite, infinite, countable, uncountable, and at most countable.
- 2.8 Theorem
Every infinite subset of a countable set is countable.
union and intersection
- a countable union of countable sets is countable.
Metric Spaces
A metric is a distance function $d$ on a set $X$ which must satisfy the following properties
for all points $p$ and $q$ in $X$:
1. $d(p, q) > 0$ if $p \neq q$, and $d(p, p) = 0$
2. $d(p, q) = d(q, p)$
3. $d(p, q) \leq d(p, r) + d(r, q)$ $\forall r \in X$
Distance in $\mathbb{R}^k$:
$d(x, y) = |x - y|$ for $x, y \in \mathbb{R}^k$
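A quick numeric spot-check of the three metric axioms for this Euclidean distance (my own sketch; random sampling, not a proof):

```python
import math
import random

def d(x, y):
    """Euclidean distance in R^k."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

random.seed(0)
pts = [tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(200)]

for _ in range(1000):
    p, q, r = random.sample(pts, 3)              # three distinct random points
    assert d(p, p) == 0 and d(p, q) > 0          # positivity
    assert math.isclose(d(p, q), d(q, p))        # symmetry
    assert d(p, q) <= d(p, r) + d(r, q) + 1e-12  # triangle inequality (float slack)
print("all sampled triples satisfy the metric axioms")
```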
open and closed balls
An open ball with center $x$ and radius $r$ is defined as $\{y \in \mathbb{R}^k : |y - x| < r\}$
A closed ball with center $x$ and radius $r$ is defined as $\{y \in \mathbb{R}^k : |y - x| \leq r\}$
We are given several definitions as follows:
- neighborhood of $p$ of radius $r$ := the set of all $q$ such that $d(p, q) < r$
- $p$ is a limit point of a set E if every neighborhood of $p$ contains some point $q \neq p$ such that $q \in E$.
- E is closed if every limit point of E is a point of E.
- $p$ is an interior point of E if there is a neighborhood N of $p$ such that N $\subset$ E.
- E is open if every point of E is an interior point of E.
- The complement of E is the set of all points $p \in X$ such that $p \notin E$, where $E \subset X$.
- E is bounded if $\exists M \in \mathbb{R}$ and $q \in X$ such that $d(q, p) < M$ $\forall p \in E$
- E is dense in X if every point of X is a limit point of E and/or a point of E.
Example: The set of all integers
Open: No Closed: Yes Perfect: No Bounded: No
2.19 Theorem
Every neighborhood is an open set.
2.20 Theorem
If $p$ is a limit point of E then every neighborhood of $p$ contains infinitely many points in E.
2.23 Theorem
A set is open iff its complement is closed.
2.24 Theorem
1. arbitrary union of open sets is open
2. arbitrary intersection of closed sets is closed
3. finite intersection of open sets is open
4. finite union of closed sets is closed
Proof uses the relation: $$(\bigcap_{\alpha}F_{\alpha})^c = \bigcup_{\alpha}F_{\alpha}^c$$
Next we introduced the notion of the closure of a set, which is the set together with its limit points.
Compact Sets
An open cover of E in a metric space X is a collection $\{G_{\alpha}\}$ of open subsets of X such that $E \subset \bigcup_{\alpha} G_{\alpha}$
K in X is compact if every open cover of K contains a finite subcover, i.e. there exist finitely many indices such that: $$K \subset G_1 \cup G_2 \cup \cdots \cup G_m$$
Heine-Borel Theorem: If $E \subset \mathbb{R}^k$, then $E$ compact $\Leftrightarrow$ $E$ closed and bounded.
Example: $(0, 1) \subset \mathbb{R}$ is not compact because it is not closed. Concretely, $$\bigcup_n \left(\frac{1}{n}, 1 - \frac{1}{n}\right)$$ is an open cover of $(0, 1)$ that has no finite subcover: any finite subcollection is contained in $(\frac{1}{N}, 1 - \frac{1}{N})$ for the largest $N$ used, and so misses points of $(0, 1)$ near $0$ and $1$.
Compactness brings us many interesting theorems, such as Theorem 2.35: closed subsets of compact sets are compact.
Theorem 2.41: If E is in $\mathbb{R}^k$ then TFAE
(a) E is closed and bounded
(b) E is compact
(c) Every infinite subset of E has a limit point in E
Continuity
3 Definitions:
(Credit to Kaylene Stocking because I really liked the way she described the definitions)
The limit perspective: Let $(x_n)$ be a sequence of points in the domain of $f$ that converges to some $x_0 \in S$. The sequence $f(x_n)$ must converge to $f(x_0)$.
The bounded rate of change perspective: Pick any point $x_0 \epsilon S$ and any $\varepsilon > 0$. There must exist some $\delta > 0$ so that moving less than $\delta$ away from $x_0$ results in a change of less than $\varepsilon$ in the value of $f(x)$.
formal definition: $f$ is continuous at a point $p$ if $\forall \varepsilon > 0$ $\exists \delta > 0$ such that $d_Y(f(x), f(p)) < \varepsilon$ for all $x \in E$ for which $d_X(x, p) < \delta$. Then if $f$ is continuous at every $p$ in its domain, $f$ is continuous overall.
The topological perspective: Let E be an open subset of the range of f. The set of points in the domain of f that f maps into E must also be open.
Uniform Convergence
-Definition: Let $(f_n)$ be a seq. of real-valued functions defined on a set $S \subseteq \mathbb{R}$. $(f_n)$ converges uniformly on S to a function $f$ if $\forall \varepsilon > 0$, $\exists N$ such that $|f_n(x) - f(x)| < \varepsilon$ $\forall x \in S$, $\forall n > N$.
Example: Let $f_n(x) = nx^n$ for $x \in [0, 1)$. We show that the convergence (to the zero function) is not uniform. If it were, there would exist $N \in \mathbb{N}$ such that $$|nx^n - 0| < 1 \ \forall x \in [0, 1), \ n > N$$ In particular we would have $(N + 1)x^{N + 1} < 1$ $\forall x \in [0, 1)$. But this fails for $x$ sufficiently close to 1: at $x = \frac{1}{(N + 1)^{\frac{1}{N + 1}}}$ we get $(N + 1)x^{N + 1} = 1$, which is not less than 1.
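A numeric illustration of the failure (my own sketch, not from the text): $f_n(x) = nx^n \rightarrow 0$ pointwise on $[0, 1)$, but $\sup_{x \in [0,1)} |f_n(x)|$ does not shrink to 0, so the convergence cannot be uniform.

```python
# At the witness point x = (1/n)^(1/n) we get n * x^n = 1 exactly, and even
# larger values of f_n appear as x approaches 1, so sup |f_n| never drops below 1.
for n in [2, 5, 10, 50, 100]:
    x_witness = (1.0 / n) ** (1.0 / n)
    x_near_one = 1.0 - 1.0 / n**2
    print(f"n={n:4d}  f_n(witness) = {n * x_witness**n:.3f}"
          f"   f_n(1 - 1/n^2) = {n * x_near_one**n:.3f}")
```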
Differentiation
two definitions: $$ f'(x) = \lim_{t \rightarrow x} \frac{f(t) - f(x)}{t-x}, t \neq x$$ $$ f'(x) = \lim_{h \rightarrow 0} \frac{f(x+h)-f(x)}{h}$$
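As a numeric sketch of the first definition (my own illustration): the difference quotient approaches the derivative as the increment shrinks, checked here for $f(x) = \sin x$ at $x = 1$, where $f'(1) = \cos 1$.

```python
import math

# Forward difference quotients (f(x+h) - f(x)) / h for f = sin at x = 1.
f, x = math.sin, 1.0
for h in [1e-1, 1e-2, 1e-4, 1e-6]:
    dq = (f(x + h) - f(x)) / h
    print(f"h={h:8.0e}  quotient = {dq:.8f}  error vs cos(1) = {abs(dq - math.cos(x)):.2e}")
```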
5.2 Theorem: If $f$ is differentiable at $x$ then $f$ is continuous at $x$.
Mean Value Theorems
These theorems relate the values of $f$ at the endpoints of an interval [a, b] on which $f$ is defined to the values of $f'$ at interior points. If a point of $f$ is a local min or local max and $f$ is differentiable there, then its derivative at that point is 0.
Generalized MVT: If $f, g$ are continuous on [a, b] and differentiable on (a, b) then $\exists x \epsilon (a, b)$ such that $$[f(b) - f(a)]g'(x) = [g(b) - g(a)]f'(x)$$
MVT: If $f$ is continuous on [a, b] and differentiable on (a, b), then $\frac{f(b) - f(a)}{b - a} = f'(x)$ for some $x \in (a, b)$.
Rolle's Theorem (special case of MVT): If $f$ is continuous on [a, b], differentiable on (a, b), and $f(a) = f(b)$, then there exists at least one $x$ in (a, b) such that $f'(x) = 0$.
Continuity of Derivatives Theorem: Suppose $f$ is a real differentiable function on [a, b] and suppose $f'(a) < \lambda < f'(b)$ then there is a point x in (a, b) such that $f'(x) = \lambda$
-useful corollary: If $f$ is differentiable on (a, b), then $f'$ cannot have any simple discontinuities on (a, b).
L'Hopital's Theorem
Let $s$ signify $a$, $a^+$, $a^-$, $\infty$, or $-\infty$ where $a \in \mathbb{R}$ and suppose $f$ and $g$ are differentiable functions for which the following limit exists: $$\lim_{x \rightarrow s} \frac{f'(x)}{g'(x)} = L$$ If $$\lim_{x \rightarrow s} f(x) = \lim_{x \rightarrow s} g(x) = 0$$ or if $$ \lim_{x \rightarrow s} |g(x)| = \infty$$ then $$\lim_{x \rightarrow s} \frac{f(x)}{g(x)} = L$$ In other words, when $\frac{f}{g}$ is in an indeterminate form we can differentiate the numerator and the denominator separately; this gives the same limit, and the new limit may be easier to calculate.
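A quick numeric check of the idea on a standard example (my own sketch): $\lim_{x \rightarrow 0} \frac{1 - \cos x}{x^2} = \frac{1}{2}$, and the ratio of derivatives $\frac{\sin x}{2x}$ heads to the same value.

```python
import math

# Compare the original ratio with the ratio of derivatives as x -> 0.
for x in [1e-1, 1e-2, 1e-3]:
    original = (1 - math.cos(x)) / x**2
    ratio_of_derivatives = math.sin(x) / (2 * x)
    print(f"x={x:6.0e}  f/g = {original:.6f}  f'/g' = {ratio_of_derivatives:.6f}")
```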
Taylor's Theorem
If $f$ is a smooth function (is infinitely differentiable) then $$\sum_{k = 0}^{\infty} \frac{f^{(k)}(c)}{k!}(x - c)^k$$ is the Taylor series of $f$ about $c$. We can stop $k$ at a certain index to get a partial representation of $f$ with some error. The error formula is: $$R_n(x) = f(x) - \sum_{k = 0}^{n - 1} \frac{f^{(k)}(c)}{k!}(x - c)^k$$ The remainder depends on $f$ and $c$. Furthermore we have that $$f(x) = \sum_{k = 0}^{\infty} \frac{f^{(k)}(c)}{k!}(x - c)^k$$ iff $$\lim_{n \rightarrow \infty} R_n(x) = 0$$ But the remainder need not always tend to zero. Hence $f$ need not be given exactly by its Taylor series.
The actual Taylor's Theorem
Let $f$ be defined on (a, b) where a < c < b. Suppose the $n$th derivative $f^{(n)}$ exists on (a, b). Then for each $x \neq c$ in (a, b) there is some $y$ between $c$ and $x$ such that $$R_n(x) = \frac{f^{(n)}(y)}{n!}(x - c)^n$$
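A numeric sketch (my own, not from the text) of the remainder shrinking for a function whose Taylor series does converge to it, $f(x) = e^x$ about $c = 0$:

```python
import math

def taylor_partial_exp(x, n):
    """Sum of the first n terms (k = 0, ..., n-1) of the Taylor series of e^x about 0."""
    return sum(x**k / math.factorial(k) for k in range(n))

# The remainder R_n(x) = e^x - partial sum tends to 0 as n grows.
x = 2.0
for n in [1, 2, 4, 8, 16]:
    partial = taylor_partial_exp(x, n)
    print(f"n={n:2d}  partial sum = {partial:.8f}  R_n(x) = {math.exp(x) - partial:.2e}")
```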
2. Questions
1. I do not understand the solution to HW10 question 4:
Prove $f(x) = \sum_{n=1}^{\infty} 4^{-n} \varphi (4^n x)$ is everywhere continuous but nowhere differentiable, where $\varphi(x) = \min\{|x-n| : n \in \mathbb{Z}\}$.
The goal of this problem is essentially to create a function which has corners everywhere, so that it is not differentiable at any point. The graph of $\varphi$ as given in the problem is nondifferentiable at $x = n + \frac{1}{2}$ where $n$ is any integer. We then define $h_n$ to be a suitable multiple of $\frac{1}{4}$ or $-\frac{1}{4}$, but to be honest I still do not follow the rest of the proof.
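Not the solution, but a numeric sketch (my own) of why differentiability fails: approximating $f$ by a partial sum and looking at difference quotients over shrinking increments $h = 4^{-m}$, the quotients jump around instead of settling on a single value.

```python
def phi(x):
    """Distance from x to the nearest integer (the sawtooth in the problem)."""
    return abs(x - round(x))

def f(x, terms=20):
    """Partial sum approximation of sum_{n>=1} 4^{-n} phi(4^n x)."""
    return sum(4.0**-n * phi(4.0**n * x) for n in range(1, terms + 1))

# Difference quotients of f at x over increments 4^{-m}: they oscillate rather
# than converging, which is the heart of "nowhere differentiable".
x = 0.3
for m in range(1, 9):
    h = 4.0**-m
    print(f"m={m}  (f(x+h) - f(x)) / h = {(f(x + h) - f(x)) / h:+.4f}")
```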
2. Proof of Theorem 19.2 from Ross: If $f$ is continuous on a closed interval [a, b], then $f$ is uniformly continuous on [a, b]. The proof states that a subsequence $x_{n_k}$ of $x_n$ converges, that $x_0 = \lim_k{x_{n_k}} = \lim_k{y_{n_k}}$, and then uses the continuity of $f$ at $x_0$. I am unsure how they obtain $\lim_k{x_{n_k}} = \lim_k{y_{n_k}}$. I am also unsure how this theorem can be true, given that we can have very steep slopes on closed intervals, which would seem to make it difficult or impossible to assign a $\delta$ to every $\epsilon > 0$ so that uniform continuity holds.
3. Theorem 9.1 of Ross: Convergent sequences are bounded. What if $s_n = \frac{1}{n-1}$ and $n$ starts at 1?
Answer: This sequence doesn't make sense because the first term is undefined.
4. In proof of Ross Theorem 33.4 (ii): ${(c, d) \subseteq [a, b]}$, how could ${(c, d) = [a, b]}$?
Answer: (Courtesy of Prof) The two cannot be equal but the statement still reads true.
5. In proof of Ross Theorem 33.5, I am unsure how one can claim M($|f|$, S) $-$ ${m}$($|f|$, S) $\leq$ M($|f|$, U) $-$ ${m}$($|f|$, U)
I believe the answer to this question is due to the fact that $\inf |f| \leq \inf f$
6. Does a subsequence have to have an equation (in terms of k) for generating the n-index of $s_n$ or could we technically just pick random n indices?
7. From Rudin ch. 2 pg. 33: Why is the set of all complex $z$ such that $|z| \leq 1$ a perfect set and why is $z$ such that $|z| < 1$ not perfect?
8. Why is a nonempty finite set closed? Answer: A single point is closed, and we have the property that the union of finitely many closed sets is closed. A nonempty finite set is the union of finitely many single points and thus is closed. (Equivalently, a finite set has no limit points, so it vacuously contains all of them.) Note that having no interior points only shows the set is not open, which by itself does not imply it is closed.
9. Is the set $(−1,1) \cap \mathbb{Q}$ open in $\mathbb{Q}$? Is it closed in $\mathbb{Q}$?
Answer: $(-1, 1) \cap \mathbb{Q}$ is open in $\mathbb{Q}$ by definition of the induced (subspace) topology, which declares a subset of $\mathbb{Q}$ open iff it equals $U \cap \mathbb{Q}$ for some open $U \subseteq \mathbb{R}$. It is not closed in $\mathbb{Q}$: the rational points $\pm 1$ are limit points of the set that it does not contain.
10. Is the set ${(-\sqrt{2}, \sqrt{2}) \cap \mathbb{Q}}$ open in $\mathbb{Q}$? Is it closed in $\mathbb{Q}$? It is both open and closed in $\mathbb{Q}$ because ${(-\sqrt{2}, \sqrt{2}) \cap \mathbb{Q} = [-\sqrt{2}, \sqrt{2}] \cap \mathbb{Q}}$.
11. What is the difference between regular convergence and uniform convergence?
Answer: Uniform convergence deals with sequences of functions: to satisfy uniform convergence on a set $S$, one $N$ must work for every point $x \in S$ simultaneously. For each ${\varepsilon > 0}$ ${\exists N}$ such that ${|f_n(x) − f(x)| < \varepsilon}$ ${\forall x \in S}$, ${\forall n > N}$. Ordinary convergence deals with a single sequence of values; for a sequence of functions, pointwise convergence only requires that for each fixed $x$ there is an $N$, which may depend on $x$.
12. Is the set $(0, 2) \cap \mathbb{Q}$ connected? No: choose $\gamma$ to be an irrational number in $(0, 2)$. Let $S = (0, 2) \cap \mathbb{Q}$, $S_1 = S \cap (-\infty, \gamma)$, and $S_2 = S \cap (\gamma, \infty)$. Then $S_1$ and $S_2$ are nonempty, open in $S$, disjoint, and their union is $S$. Thus $S$ is disconnected.
13. Find a subset K ⊂ Q, such that K is closed and bounded in Q, but not compact.
14. Find a subset K ⊂ Q, such that K is compact and K is an infinite set.
15. If A is open, then f(A) is open. FALSE, why? Answer: consider the constant map sending $(0, 1) \subset \mathbb{R}$ to $\{0\}$. The domain $(0, 1)$ is open but its image $\{0\}$ is not open.
16. If A is closed, then f(A) is closed. FALSE, why? Answer: consider $f(x) = \arctan x$ on the closed set $A = \mathbb{R}$; the image $(-\frac{\pi}{2}, \frac{\pi}{2})$ is not closed.
17. Why is $\{f^{-1}(U_0) : U_0 \in U\}$ a cover of E, where E is the domain set of $f$ and $U$ is an open cover of $f(E)$? Answer: Let $x \in E$. Then, as $U$ is a cover of $f(E)$, $f(x) \in U_0$ for some $U_0 \in U$. Thus $x \in f^{-1}(U_0)$.