**Read** Section 3.6 for Monday.
Work through recommended homework questions.

**Tutorials:** No tutorials next week!

**We're more than halfway done with the lectures!** This is lecture 20 out
of 37.

**Office hour:** Monday, 1:30-2:30, MC103B.

**Help Centers:** Monday-Friday 2:30-6:30 in MC 106.

**Definition:** A **subspace** of $\R^n$ is any collection $S$ of
vectors in $\R^n$ such that:

1. The zero vector $\vec 0$ is in $S$.

2. $S$ is **closed under addition**:
If $\vu$ and $\vv$ are in $S$, then $\vu + \vv$ is in $S$.

3. $S$ is **closed under scalar multiplication**:
If $\vu$ is in $S$ and $c$ is any scalar, then $c \vu$ is in $S$.
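For example, the line $S = \{\, t \coll 1 2 : t \in \R \,\}$ through the
origin in $\R^2$ is a subspace: taking $t = 0$ shows that $\vec 0$ is in $S$, and
$$
s \coll 1 2 + t \coll 1 2 = (s+t) \coll 1 2
\qtext{and}
c \left( t \coll 1 2 \right) = (ct) \coll 1 2
$$
are again in $S$.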

**Definition:** A **basis** for a subspace $S$ of $\R^n$ is a
set of vectors $\vv_1, \ldots, \vv_k$ such that:

1. $S = \span(\vv_1, \ldots, \vv_k)$, and

2. $\vv_1, \ldots, \vv_k$ are linearly independent.

**Definition:** Let $A$ be an $m \times n$ matrix.

1. The **row space** of $A$ is the subspace $\row(A)$ of $\R^n$ spanned
by the rows of $A$.

2. The **column space** of $A$ is the subspace $\col(A)$ of $\R^m$ spanned
by the columns of $A$.

3. The **null space** of $A$ is the subspace $\null(A)$ of $\R^n$
consisting of the solutions to the system $A \vx = \vec 0$.

**Theorem 3.20:**
Let $A$ and $R$ be row equivalent matrices.
Then $\row(A) = \row(R)$.

Also, $\null(A) = \null(R)$. But elementary row operations can change the column space! So in general $\col(A) \neq \col(R)$.
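For example, if
$$
A = \bmat{rr} 1 & 1 \\ 1 & 1 \emat
\qtext{and}
R = \bmat{rr} 1 & 1 \\ 0 & 0 \emat ,
$$
then $R$ is obtained from $A$ by one row operation, but
$\col(A) = \span(\coll 1 1)$ while $\col(R) = \span(\coll 1 0)$.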

**Theorem:** If $R$ is a matrix in row echelon form, then the
nonzero rows of $R$ form a basis for $\row(R)$.

So if $R$ is a row echelon form of $A$, then a basis for $\row(A)$ is given by the nonzero rows of $R$.

Now, since $\null(A) = \null(R)$, the columns of $R$ have the same dependency relationships as the columns of $A$: a vector $\vx$ is in the null space exactly when its entries are the coefficients of a linear combination of the columns equal to $\vec 0$.

It is easy to see that the pivot columns of $R$ form a basis for $\col(R)$, so the corresponding columns of $A$ form a basis for $\col(A)$.

We learned in Chapter 2 how to use $R$ to find a basis for the **null space**
of a matrix $A$, even though we didn't use this terminology.

1. Find the reduced row echelon form $R$ of $A$.

2. The nonzero rows of $R$ form a basis for $\row(A) = \row(R)$.

3. The columns of $A$ that correspond to the columns of $R$ with leading 1's
form a basis for $\col(A)$.

4. Use back substitution to solve $R \vx = \vec 0$; the vectors that
arise are a basis for $\null(A) = \null(R)$.
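The four steps can be sketched in pure Python, using exact `Fraction` arithmetic (the matrix `A` below is a small made-up example, not the one from class):

```python
from fractions import Fraction

def rref(A):
    """Return (R, pivots): the reduced row echelon form of A
    and the list of its pivot columns."""
    R = [[Fraction(x) for x in row] for row in A]
    m, n = len(R), len(R[0])
    pivots = []
    r = 0
    for c in range(n):
        # find a row at position r or below with a nonzero entry in column c
        i = next((i for i in range(r, m) if R[i][c] != 0), None)
        if i is None:
            continue
        R[r], R[i] = R[i], R[r]
        piv = R[r][c]
        R[r] = [x / piv for x in R[r]]      # make the leading entry 1
        for j in range(m):                  # clear the rest of column c
            if j != r and R[j][c] != 0:
                R[j] = [a - R[j][c] * b for a, b in zip(R[j], R[r])]
        pivots.append(c)
        r += 1
    return R, pivots

A = [[1, 2, 3],
     [2, 4, 7],
     [1, 2, 4]]
R, pivots = rref(A)

# Step 2: the nonzero rows of R are a basis for row(A).
row_basis = [row for row in R if any(x != 0 for x in row)]

# Step 3: the columns of A in the pivot positions are a basis for col(A).
col_basis = [[A[i][c] for i in range(len(A))] for c in pivots]

# Step 4: one basis vector of null(A) for each free (non-pivot) column,
# read off from R as in back substitution.
free = [c for c in range(len(A[0])) if c not in pivots]
null_basis = []
for f in free:
    v = [Fraction(0)] * len(A[0])
    v[f] = Fraction(1)
    for i, p in enumerate(pivots):
        v[p] = -R[i][f]
    null_basis.append(v)
```

Here `pivots` comes out as `[0, 2]`, so the first and third columns of `A` form a basis for $\col(A)$, and `null_basis` is `[[-2, 1, 0]]`.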

In fact, a row echelon form of $A$ is enough: in step 3, use the columns of $R$ with leading nonzero entries (the pivot columns), and in step 4 use back substitution as usual.

These methods can be used to compute a basis for a subspace $S$ spanned by some vectors $\vv_1, \ldots, \vv_k$.

The **row method**:

1. Form the matrix $A$ whose rows are $\vv_1, \ldots, \vv_k$, so $S = \row(A)$.

2. Reduce $A$ to row echelon form $R$.

3. The nonzero rows of $R$ will be a basis of $S = \row(A) = \row(R)$.

The **column method**:

1. Form the matrix $A$ whose columns are $\vv_1, \ldots, \vv_k$, so $S = \col(A)$.

2. Reduce $A$ to row echelon form $R$.

3. The columns of $A$ that correspond to the columns of $R$ with leading entries
form a basis for $S = \col(A)$.
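For example, let $S = \span(\colll 3 0 2, \colll {-2} 1 1, \colll 1 1 3)$.
With the row method,
$$
A = \bmat{rrr} 3 & 0 & 2 \\ -2 & 1 & 1 \\ 1 & 1 & 3 \emat
\longrightarrow
R = \bmat{rrr} 1 & 1 & 3 \\ 0 & 3 & 7 \\ 0 & 0 & 0 \emat ,
$$
so $\colll 1 1 3$ and $\colll 0 3 7$ form a basis for $S$.
With the column method,
$$
A = \bmat{rrr} 3 & -2 & 1 \\ 0 & 1 & 1 \\ 2 & 1 & 3 \emat
\longrightarrow
R = \bmat{rrr} 3 & -2 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \emat ,
$$
so the first two vectors $\colll 3 0 2$ and $\colll {-2} 1 1$ form a basis for $S$.
The two bases are different, but both have two vectors.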

A very similar argument works for the general case.

**Definition:** The number of vectors in a basis for a subspace $S$ is
called the **dimension** of $S$, denoted $\dim S$.
(Every basis for $S$ has the same number of vectors, so this is well-defined.)

**Example:** $\dim \R^n = \query{n}$

**Example:** If $S$ is a line through the origin in $\R^2$ or $\R^3$,
then $\dim S = \query{1}$

**Example:** If $S$ is a plane through the origin in $\R^3$,
then $\dim S = \query{2}$

**Example:** If $S = \span(\colll 3 0 2, \colll {-2} 1 1, \colll 1 1 3)$,
then $\dim S = \query{2}$.
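Indeed, $\colll 1 1 3 = \colll 3 0 2 + \colll {-2} 1 1$, while the first two
vectors are linearly independent (neither is a multiple of the other),
so they form a basis for $S$.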

**Example:** Let $A$ be the matrix from last class whose reduced row
echelon form is
$$
R = \bmat{rrrrr}
1 & 0 & 1 & 0 & -1 \\
0 & 1 & 2 & 0 & 3 \\
0 & 0 & 0 & 1 & 4 \\
0 & 0 & 0 & 0 & 0
\emat
$$
Then: $\quad\dim \row(A) = \query{3}$
$\quad\dim \col(A) = \query{3}$
$\quad\dim \null(A) = \query{2}$

Note that $\dim \row(A) = \rank(A)$, since we defined the rank of $A$ to be the number of nonzero rows in $R$. The above theorem shows that this number doesn't depend on how you row reduce $A$.

We call the dimension of the null space the **nullity** of $A$
and write $\nullity(A) = \dim \null(A)$.
This is what we called the "number of free variables" in Chapter 2.

From the way we find the basis for $\row(A)$, $\col(A)$ and $\null(A)$, can you deduce any relationships between their dimensions?

**Theorems 3.24 and 3.26:**
Let $A$ be an $m \times n$ matrix. Then
$$
\kern-8ex
\dim \row(A) = \dim \col(A) = \rank(A) \qtext{and} \rank(A) + \nullity(A) = n .
$$

Very important!

**True/false:**
if $A$ is $2 \times 5$, then the nullity of $A$ is 3.
False. We know that $\rank(A) \leq 2$ and $\rank(A) + \nullity(A) = 5$,
so $\nullity(A) \geq 3$ (and $\leq 5$).

**True/false:**
if $A$ is $5 \times 2$, then $\nullity(A) \geq 3$.
False. $\rank(A) + \nullity(A) = 2$, so $\nullity(A) = 0$, $1$ or $2$.

**Example:** Find the nullity of
$$
M = \bmat{rrrrrrr} 1 & 2 & 3 & 4 & 5 & 6 & 7 \\
8 & 9 & 10 & 11 & 12 & 13 & 14 \emat
$$
and of $M^T$.
Any guesses?
The rows of $M$ are linearly independent, so the rank is $2$, so
the nullity is $7 - 2 = 5$.
The rank of $M^T$ is also $2$, so the nullity is $2 - 2 = 0$.

For larger matrices, you would compute the rank by row reduction.
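For example, row reduction over exact fractions can be sketched in Python (`rank` here is a helper written for this illustration, not course code):

```python
from fractions import Fraction

def rank(A):
    """Count the nonzero rows left after forward elimination."""
    R = [[Fraction(x) for x in row] for row in A]
    m, n = len(R), len(R[0])
    r = 0
    for c in range(n):
        i = next((i for i in range(r, m) if R[i][c] != 0), None)
        if i is None:
            continue                     # no pivot in this column
        R[r], R[i] = R[i], R[r]
        for j in range(r + 1, m):        # clear the entries below the pivot
            factor = R[j][c] / R[r][c]
            R[j] = [a - factor * b for a, b in zip(R[j], R[r])]
        r += 1
    return r

M = [[1, 2, 3, 4, 5, 6, 7],
     [8, 9, 10, 11, 12, 13, 14]]
MT = [list(col) for col in zip(*M)]      # transpose of M

# rank(M) == rank(MT) == 2, so nullity(M) = 7 - 2 = 5
# and nullity(MT) = 2 - 2 = 0.
```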

**Theorem 3.27:**
Let $A$ be an $n \times n$ matrix. The following are equivalent:

a. $A$ is invertible.

b. $A \vx = \vb$ has a unique solution for every $\vb \in \R^n$.

c. $A \vx = \vec 0$ has only the trivial (zero) solution.

d. The reduced row echelon form of $A$ is $I_n$.

f. $\rank(A) = n$

g. $\nullity(A) = 0$

h. The columns of $A$ are linearly independent.

i. The columns of $A$ span $\R^n$.

j. The columns of $A$ are a basis for $\R^n$.

(d) $\iff$ (f): the only $n \times n$ matrix in reduced row echelon form with $n$ nonzero rows is $I_n$.

(f) $\iff$ (g): follows from $\rank(A) + \nullity(A) = n$.

(c) $\iff$ (h): easy, since $A \vx$ is exactly a linear combination of the columns of $A$ with the entries of $\vx$ as coefficients.

(i) $\implies$ (f) $\implies$ (b) $\implies$ (i): Explain.

(h) and (i) $\iff$ (j): Clear.

In fact, since $\rank(A) = \rank(A^T)$, we can add the following:

k. The rows of $A$ are linearly independent.

l. The rows of $A$ span $\R^n$.

m. The rows of $A$ are a basis for $\R^n$.

**Example 3.52:**
Show that the vectors $\colll 1 2 3$, $\colll {-1} 0 1$ and $\colll 4 9 7$
form a basis for $\R^3$.

**Solution:** Show that the matrix $A$ with these vectors as its
columns has rank 3. On whiteboard.
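One way the computation can go:
$$
A = \bmat{rrr} 1 & -1 & 4 \\ 2 & 0 & 9 \\ 3 & 1 & 7 \emat
\longrightarrow
\bmat{rrr} 1 & -1 & 4 \\ 0 & 2 & 1 \\ 0 & 0 & -7 \emat ,
$$
which has three nonzero rows, so $\rank(A) = 3$, and by Theorem 3.27
the columns of $A$ form a basis for $\R^3$.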

Not covering Theorem 3.28.

**Theorem 3.29:** Let $\cB = \{ \vv_1, \ldots, \vv_k \}$ be a basis for a
subspace $S$ of $\R^n$. Then for every vector $\vv$ in $S$, there is *exactly
one way* to write $\vv$ as a linear combination of the vectors in $\cB$:
$$
\vv = c_1 \vv_1 + \cdots + c_k \vv_k
$$

**Proof:** Try to work it out yourself! It's a good exercise.

We call the coefficients $c_1, c_2, \ldots, c_k$ the **coordinates
of $\vv$ with respect to $\cB$**, and write
$$
[\vv]_{\cB} = \collll {c_1} {c_2} {\vdots} {c_k}
$$

**Example:** Let $S$ be the plane through the origin in $\R^3$
spanned by $\vv_1 = \colll 1 2 3$ and $\vv_2 = \colll 4 5 6$,
so $\cB = \{ \vv_1, \vv_2 \}$ is a basis for $S$.
Let $\vv = \colll 6 9 {12}$. Then
$$
\vv = 2 \vv_1 + 1 \vv_2
\qqtext{so}
[\vv]_{\cB} = \coll 2 1
$$
Note that while $\vv$ is a vector in $\R^3$, it only has **two** coordinates
with respect to $\cB$.

**Example:** Let $\cB = \{ \ve_1, \ve_2, \ve_3 \}$ be the standard basis
for $\R^3$, and consider $\vv = \colll 6 9 {12}$. Then
$$
\vv = 6 \ve_1 + 9 \ve_2 + 12\ve_3
\qqtext{so}
[\vv]_{\cB} = \colll 6 9 {12}
$$
We've implicitly been using the standard basis everywhere, but often in
applications it is better to use a basis suited to the problem.
