Math 1600A Lecture 20, Section 2, 25 Oct 2013

$ \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\query}[1]{\toggle{\text{?}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\text{?}}{#1}\endtoggle} $


Read Section 3.6 for Monday. Work through recommended homework questions.

Tutorials: No tutorials next week!

We're more than halfway through the lectures! This is lecture 20 out of 37.

Office hour: Monday, 1:30-2:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.

Partial review of Lectures 18 and 19:


Definition: A subspace of $\R^n$ is any collection $S$ of vectors in $\R^n$ such that:
1. The zero vector $\vec 0$ is in $S$.
2. $S$ is closed under addition: If $\vu$ and $\vv$ are in $S$, then $\vu + \vv$ is in $S$.
3. $S$ is closed under scalar multiplication: If $\vu$ is in $S$ and $c$ is any scalar, then $c \vu$ is in $S$.


Definition: A basis for a subspace $S$ of $\R^n$ is a set of vectors $\vv_1, \ldots, \vv_k$ such that:
1. $S = \span(\vv_1, \ldots, \vv_k)$, and
2. $\vv_1, \ldots, \vv_k$ are linearly independent.

Subspaces associated with matrices

Definition: Let $A$ be an $m \times n$ matrix.

1. The row space of $A$ is the subspace $\row(A)$ of $\R^n$ spanned by the rows of $A$.
2. The column space of $A$ is the subspace $\col(A)$ of $\R^m$ spanned by the columns of $A$.
3. The null space of $A$ is the subspace $\null(A)$ of $\R^n$ consisting of the solutions to the system $A \vx = \vec 0$.

Theorem 3.20: Let $A$ and $R$ be row equivalent matrices. Then $\row(A) = \row(R)$.

Also, $\null(A) = \null(R)$. But elementary row operations can change the column space! So in general $\col(A) \neq \col(R)$.

Theorem: If $R$ is a matrix in row echelon form, then the nonzero rows of $R$ form a basis for $\row(R)$.

So if $R$ is a row echelon form of $A$, then a basis for $\row(A)$ is given by the nonzero rows of $R$.

Now, since $\null(A) = \null(R)$, the columns of $R$ have the same dependency relationships as the columns of $A$.

It is easy to see that the pivot columns of $R$ form a basis for $\col(R)$, so the corresponding columns of $A$ form a basis for $\col(A)$.
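This can be checked computationally. Here is a small sketch using SymPy (a Python library, not part of the course) on a hypothetical matrix built so that its third column equals the first column plus twice the second; the same dependency relation survives row reduction:

```python
from sympy import Matrix

# Hypothetical 3x3 matrix in which column 3 = column 1 + 2*(column 2)
A = Matrix([[1, 0, 1],
            [1, 1, 3],
            [2, 1, 4]])
assert A.col(2) == A.col(0) + 2 * A.col(1)

R, _ = A.rref()
# The same dependency holds among the columns of R,
# since null(A) = null(R)
assert R.col(2) == R.col(0) + 2 * R.col(1)
```
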

We learned in Chapter 2 how to use $R$ to find a basis for the null space of a matrix $A$, even though we didn't use this terminology.


Finding bases for $\row(A)$, $\col(A)$ and $\null(A)$:

1. Find the reduced row echelon form $R$ of $A$.
2. The nonzero rows of $R$ form a basis for $\row(A) = \row(R)$.
3. The columns of $A$ that correspond to the columns of $R$ with leading 1's form a basis for $\col(A)$.
4. Use back substitution to solve $R \vx = \vec 0$; the vectors that arise are a basis for $\null(A) = \null(R)$.

In step 1, row echelon form is in fact enough. In step 3, you then look at the columns of $R$ with leading nonzero entries (the pivot columns).
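The four steps above can be sketched in SymPy (a Python library, not part of the course); the matrix here is a hypothetical example, not one from the lecture:

```python
from sympy import Matrix

# Hypothetical example matrix
A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 3],
            [1, 2, 1, 2]])

R, pivots = A.rref()                              # step 1: reduced row echelon form
row_basis = [R.row(i) for i in range(A.rank())]   # step 2: nonzero rows of R
col_basis = [A.col(j) for j in pivots]            # step 3: pivot columns, taken from A
null_basis = A.nullspace()                        # step 4: basis for null(A)

print(pivots)          # (0, 2)
print(len(null_basis)) # 2 (= number of free variables)
for x in null_basis:
    assert A * x == Matrix([0, 0, 0])  # each basis vector solves A x = 0
```
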

These methods can be used to compute a basis for a subspace $S$ spanned by some vectors $\vv_1, \ldots, \vv_k$.

The row method:

1. Form the matrix $A$ whose rows are $\vv_1, \ldots, \vv_k$, so $S = \row(A)$.
2. Reduce $A$ to row echelon form $R$.
3. The nonzero rows of $R$ will be a basis of $S = \row(A) = \row(R)$.

The column method:

1. Form the matrix $A$ whose columns are $\vv_1, \ldots, \vv_k$, so $S = \col(A)$.
2. Reduce $A$ to row echelon form $R$.
3. The columns of $A$ that correspond to the columns of $R$ with leading entries form a basis for $S = \col(A)$.
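Both methods can be sketched side by side in SymPy (a Python library, not part of the course). The three vectors here are a hypothetical example chosen so that $\vv_3 = \vv_1 + \vv_2$, making the span 2-dimensional:

```python
from sympy import Matrix

# Hypothetical spanning set (v3 = v1 + v2, so the span is 2-dimensional)
v1, v2, v3 = Matrix([1, 1, 0]), Matrix([0, 1, 1]), Matrix([1, 2, 1])

# Row method: vectors as rows; nonzero rows of R form a basis
A_rows = Matrix.vstack(v1.T, v2.T, v3.T)
R_rows, _ = A_rows.rref()
row_basis = [R_rows.row(i) for i in range(A_rows.rank())]

# Column method: vectors as columns; pivot columns of A form a basis
A_cols = Matrix.hstack(v1, v2, v3)
_, pivots = A_cols.rref()
col_basis = [A_cols.col(j) for j in pivots]

print([list(r) for r in row_basis])  # [[1, 0, -1], [0, 1, 1]]
print(pivots)                        # (0, 1): basis {v1, v2}
```

Note that the two methods give different bases for the same subspace: the row method produces "cleaned-up" vectors, while the column method selects a subset of the original vectors.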

New material

Dimension and Rank

We have seen that a subspace has many bases. Have you noticed anything about the number of vectors in each basis?

Theorem 3.23: Any two bases for a subspace $S$ of $\R^n$ contain the same number of vectors.

Idea of proof:

A very similar argument works for the general case.

Definition: The number of vectors in a basis for a subspace $S$ is called the dimension of $S$, denoted $\dim S$.

Example: $\dim \R^n = \query{n}$

Example: If $S$ is a line through the origin in $\R^2$ or $\R^3$, then $\dim S = \query{1}$

Example: If $S$ is a plane through the origin in $\R^3$, then $\dim S = \query{2}$

Example: If $S = \span(\colll 3 0 2, \colll {-2} 1 1, \colll 1 1 3)$, then $\dim S = \query{2}$.
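As a quick check, $\dim S$ equals the rank of the matrix whose columns are the spanning vectors. A sketch using SymPy (a Python library, not part of the course):

```python
from sympy import Matrix

# Columns are the three spanning vectors from the example
A = Matrix([[3, -2, 1],
            [0,  1, 1],
            [2,  1, 3]])
print(A.rank())  # 2, since the third vector is the sum of the first two
```
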

Example: Let $A$ be the matrix from last class whose reduced row echelon form is $$ R = \bmat{rrrrr} 1 & 0 & 1 & 0 & -1 \\ 0 & 1 & 2 & 0 & 3 \\ 0 & 0 & 0 & 1 & 4 \\ 0 & 0 & 0 & 0 & 0 \emat $$ Then: $\quad\dim \row(A) = \query{3}$ $\quad\dim \col(A) = \query{3}$ $\quad\dim \null(A) = \query{2}$

Note that $\dim \row(A) = \rank(A)$, since we defined the rank of $A$ to be the number of nonzero rows in $R$. The above theorem shows that this number doesn't depend on how you row reduce $A$.

We call the dimension of the null space the nullity of $A$ and write $\nullity(A) = \dim \null(A)$. This is what we called the "number of free variables" in Chapter 2.

From the way we find the basis for $\row(A)$, $\col(A)$ and $\null(A)$, can you deduce any relationships between their dimensions?

Theorems 3.24 and 3.26:

Theorem 3.24: The row and column spaces of a matrix $A$ have the same dimension. Therefore $\rank(A) = \dim \row(A) = \dim \col(A)$.

Theorem 3.26 (The Rank Theorem): If $A$ is an $m \times n$ matrix, then $$\rank(A) + \nullity(A) = n.$$

Very important!


True/false: for any $A$, $\rank(A) = \rank(A^T)$.

True/false: if $A$ is $2 \times 5$, then the nullity of $A$ is 3.

True/false: if $A$ is $5 \times 2$, then $\nullity(A) \geq 3$.

Example: Find the nullity of $$ M = \bmat{rrrrrrr} 1 & 2 & 3 & 4 & 5 & 6 & 7 \\ 8 & 9 & 10 & 11 & 12 & 13 & 14 \emat $$ and of $M^T$. Any guesses?

For larger matrices, you would compute the rank by row reduction.
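The Rank Theorem makes the example above quick: the two rows of $M$ are clearly independent, so $\rank(M) = 2$. A SymPy check (a Python library, not part of the course):

```python
from sympy import Matrix

M = Matrix([[1, 2,  3,  4,  5,  6,  7],
            [8, 9, 10, 11, 12, 13, 14]])

# Rank Theorem: rank + nullity = number of columns
print(M.rank())               # 2
print(M.cols - M.rank())      # nullity(M) = 7 - 2 = 5
print(M.T.cols - M.T.rank())  # nullity(M^T) = 2 - 2 = 0
```
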

Fundamental Theorem of Invertible Matrices, Version 2

Theorem 3.27: Let $A$ be an $n \times n$ matrix. The following are equivalent:
a. $A$ is invertible.
b. $A \vx = \vb$ has a unique solution for every $\vb \in \R^n$.
c. $A \vx = \vec 0$ has only the trivial (zero) solution.
d. The reduced row echelon form of $A$ is $I_n$.
f. $\rank(A) = n$
g. $\nullity(A) = 0$
h. The columns of $A$ are linearly independent.
i. The columns of $A$ span $\R^n$.
j. The columns of $A$ are a basis for $\R^n$.

Proof: We saw that (a), (b), (c) and (d) are equivalent in Theorem 3.12. The new ones are easier:

(d) $\iff$ (f): the only $n \times n$ matrix in reduced row echelon form with $n$ nonzero rows is $I_n$.

(f) $\iff$ (g): follows from $\rank(A) + \nullity(A) = n$.

(c) $\iff$ (h): easy.

(i) $\implies$ (f) $\implies$ (b) $\implies$ (i): Explain.

(h) and (i) $\iff$ (j): Clear.





In fact, since $\rank(A) = \rank(A^T)$, we can add the following:

k. The rows of $A$ are linearly independent.
l. The rows of $A$ span $\R^n$.
m. The rows of $A$ are a basis for $\R^n$.

Example 3.52: Show that the vectors $\colll 1 2 3$, $\colll {-1} 0 1$ and $\colll 4 9 7$ form a basis for $\R^3$.

Solution: Show that matrix $A$ with these vectors as the columns has rank 3. On whiteboard.
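The rank computation can be sketched in SymPy (a Python library, not part of the course):

```python
from sympy import Matrix

# Columns are the three given vectors
A = Matrix([[1, -1, 4],
            [2,  0, 9],
            [3,  1, 7]])
print(A.rank())  # 3, so by the Fundamental Theorem the columns form a basis for R^3
```
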

Not covering Theorem 3.28.


Suppose $S$ is a subspace of $\R^n$ with a basis $\cB = \{ \vv_1, \ldots, \vv_k \}$, so $S$ has dimension $k$. Then we can assign coordinates to vectors in $S$, using the following theorem:

Theorem 3.29: For every vector $\vv$ in $S$, there is exactly one way to write $\vv$ as a linear combination of the vectors in $\cB$: $$ \vv = c_1 \vv_1 + \cdots + c_k \vv_k $$

Proof: Try to work it out yourself! It's a good exercise.

We call the coefficients $c_1, c_2, \ldots, c_k$ the coordinates of $\vv$ with respect to $\cB$, and write $$ [\vv]_{\cB} = \collll {c_1} {c_2} {\vdots} {c_k} $$

Example: Let $S$ be the plane through the origin in $\R^3$ spanned by $\vv_1 = \colll 1 2 3$ and $\vv_2 = \colll 4 5 6$, so $\cB = \{ \vv_1, \vv_2 \}$ is a basis for $S$. Let $\vv = \colll 6 9 {12}$. Then $$ \vv = 2 \vv_1 + 1 \vv_2 \qqtext{so} [\vv]_{\cB} = \coll 2 1 $$ Note that while $\vv$ is a vector in $\R^3$, it only has two coordinates with respect to $\cB$.
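The coordinates in this example can be found by row reducing the augmented matrix $[\,\vv_1 \ \vv_2 \mid \vv\,]$. A sketch using SymPy (a Python library, not part of the course):

```python
from sympy import Matrix

v1, v2 = Matrix([1, 2, 3]), Matrix([4, 5, 6])
v = Matrix([6, 9, 12])

# Solve [v1 v2] c = v by row reducing the augmented matrix
aug = Matrix.hstack(v1, v2, v)
R, _ = aug.rref()
coords = R[:2, 2]    # coordinates of v with respect to B
print(list(coords))  # [2, 1]
```
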

Example: Let $\cB = \{ \ve_1, \ve_2, \ve_3 \}$ be the standard basis for $\R^3$, and consider $\vv = \colll 6 9 {12}$. Then $$ \vv = 6 \ve_1 + 9 \ve_2 + 12\ve_3 \qqtext{so} [\vv]_{\cB} = \colll 6 9 {12} $$ We've implicitly been using the standard basis everywhere, but often in applications it is better to use a basis suited to the problem.