**Read** Section 2.4 for next class.
Work through recommended homework questions.

**Midterm 1** is tomorrow 7-8:30pm.
It covers material up to the end of Section 2.2, excluding
linear systems over $\Z_m$.
A **practice exam** is available from the course home page.
Last name A-Q must write in **NS1**, R-Z in **NS7**.
See the missed exam section of
the course web page for policies, including for illness.

**Tutorials:** No quiz this week; the tutorials will focus on review. Take advantage of them!
No quizzes next week either.

**Office hour:** today, 12:30-1:30, MC103B.

**Help Centers:** Monday-Friday 2:30-6:30 in MC 106.

These **lecture notes** are now available in pdf format as well,
posted a day or two after each lecture.
Be sure to let me know of any technical problems.

**Example:** Is $\colll 4 8 6$ a linear combination of
$\colll 4 5 6$ and $\colll 2 1 3$?

That is, can we find scalars $x$ and $y$ such that $$x \colll 4 5 6 + y \colll 2 1 3 = \colll 4 8 6 ?$$

Writing this out in components gives the linear system
$$
\begin{aligned}
4 x + 2 y &= 4 \\
5 x + \ph y &= 8 \\
6 x + 3 y &= 6
\end{aligned}
\qquad\text{with augmented matrix}\qquad
\bmat{rr|r}
4 & 2 & 4 \\
5 & 1 & 8 \\
6 & 3 & 6
\emat
$$
and we **already know** how to determine whether this system is consistent:
use **row reduction**!
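As an illustrative sketch (using SymPy, which is not required for the course, but any exact row-reduction tool behaves the same way), we can row reduce the augmented matrix and read off the answer:

```python
from sympy import Matrix

# Augmented matrix [A | b] from the example above.
aug = Matrix([[4, 2, 4],
              [5, 1, 8],
              [6, 3, 6]])

rref, pivots = aug.rref()
# The system is consistent exactly when there is NO pivot in the last column.
consistent = (aug.cols - 1) not in pivots
x, y = rref[0, 2], rref[1, 2]  # read off the solution from the rref
```

Here the system turns out to be consistent with $x = 2$, $y = -2$, so $\colll 4 8 6$ *is* a linear combination of the two given vectors.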

**Theorem 2.4:** A system with augmented matrix $[A \mid \vb \,]$ is
consistent if and only if $\vb$ is a linear combination of the columns of $A$.

This gives a **different** geometrical way to understand the solutions to
a system.

If $\span(S) = \R^n$, then $S$ is called a **spanning set** for $\R^n$.

**Example:** $\span(\ve_1, \ve_2, \ldots, \ve_n) = \R^n$.

**Example:** The span of $\vu = \colll 1 2 3$ and $\vv = \colll 4 5 6$
consists of every vector $\vx$ that can be written as
$$
\vx = s \vu + t \vv
$$
for some scalars $s$ and $t$.
Since $\vu$ and $\vv$ are not parallel, this is the plane through the origin in $\R^3$
with direction vectors $\vu$ and $\vv$.
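A quick computational way to test whether a vector $\vx$ lies in this plane (a sketch, assuming SymPy; the test vectors below are my own illustrative choices): $\vx$ is in $\span(\vu, \vv)$ exactly when appending $\vx$ as a column does not increase the rank.

```python
from sympy import Matrix

u = Matrix([1, 2, 3])
v = Matrix([4, 5, 6])

def in_span(b):
    # b is a linear combination of the columns of A iff rank is unchanged.
    A = u.row_join(v)
    return A.rank() == A.row_join(b).rank()

in_span(Matrix([5, 7, 9]))   # True: this is u + v
in_span(Matrix([1, 0, 0]))   # False: fails the plane equation x - 2y + z = 0
```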

**Question:** We saw that $\span(\coll 1 0, \coll 0 1) = \R^2$.
What is $\span(\coll 1 0, \coll 0 1, \coll 2 4)$?

**Question:** What vector is always in $\span(\vv_1, \vv_2, \ldots, \vv_k)$?

**Definition:** A set of vectors $\vv_1, \ldots, \vv_k$ is
**linearly dependent** if there are scalars $c_1, \ldots, c_k$,
__at least one of which is nonzero__, such that
$$
c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 .
$$
Since at least one of the scalars is non-zero, the corresponding vector can be
expressed as a linear combination of the others.

**Example:** $ \coll {-2} 4 - 2 \coll {-1} 2 + 0 \coll 5 6 = \coll 0 0 $,
so the vectors $\coll {-2} 4$, $\coll {-1} 2$ and $\coll 5 6$ are linearly dependent.

Note that each of the first two vectors is a scalar multiple of the other, but the third is not a linear combination of the first two. So linear dependence means that *some* vector is a combination of the others, not that *every* vector is.

**Example:** Are the vectors $\ve_1 = \coll 0 1$ and $\ve_2 = \coll 1 0$
linearly dependent?

**Solution:**
If $c \, \ve_1 + d \, \ve_2 = \vec 0$ and $c \neq 0$, then $\ve_1 = -\frac d c \ve_2$,
which is not possible, since every multiple of $\ve_2$ has second component $0$.
Similarly, if $d \neq 0$, then $\ve_2$ is a multiple of $\ve_1$, which is also impossible.
So the only way to have $c \, \ve_1 + d \, \ve_2 = \vec 0$ is with $c = d = 0$,
which means the vectors are linearly **independent**.

**Theorem 2.5**: The vectors $\vv_1, \ldots, \vv_k$ are linearly dependent
if and only if at least one of them can be expressed as a linear combination of the others.

**Proof:** We've seen one direction.
For the other, if $\vv_k = c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1}$,
then $c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1} - \vv_k = \vec 0$, so the
vectors are linearly dependent.
The same argument works if it is a different vector that can be expressed
in terms of the others.

**Example:** What about the vectors $\ve_1$, $\ve_2$ and $\coll 0 0$?

**Solution:**
They are linearly dependent, since
$$
0 \coll 1 0 + 0 \coll 0 1 + 1 \coll 0 0 = \coll 0 0 .
$$
**Fact:** Any set of vectors containing the zero vector is linearly dependent.

**Definition:** A set of vectors $\vv_1, \ldots, \vv_k$ is
**linearly independent** if it is not linearly dependent.

Another way to say this is that the system $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 $$ has only the trivial solution $c_1 = \cdots = c_k = 0$.

This is something we know how to figure out! Use **row reduction**!

**Example:** Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$
and $\vw = \colll 6 {-4} {-2}$ linearly independent?

That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {-2} = \vec 0 $$ have a non-trivial solution?

The augmented matrix is
$$
\bmat{rrr|r}
-1 & 2 & 6 & 0 \\
3 & 1 & -4 & 0 \\
2 & 1 & -2 & 0
\emat
\qquad
\text{which row reduces to}
\qquad
\bmat{rrr|r}
-1 & 2 & 6 & 0 \\
0 & 1 & 2 & 0 \\
0 & 0 & 0 & 0
\emat
$$
So what's the answer?
There are 3 variables and 2 leading variables (the rank is 2), so there
is one free variable, which means there are non-trivial solutions.
Therefore, the vectors are linearly **dependent**.
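To make the dependence concrete, one can compute the non-trivial solutions directly; a sketch using SymPy (an illustrative choice, not part of the course):

```python
from sympy import Matrix

# Columns are the vectors u, v, w from the example.
A = Matrix([[-1, 2,  6],
            [ 3, 1, -4],
            [ 2, 1, -2]])

ns = A.nullspace()   # one basis vector per free variable
c = ns[0]            # a dependence relation: A * c = 0
```

Here the nullspace basis vector is $(2, -2, 1)$, which says $2\vu - 2\vv + \vw = \vec 0$, an explicit dependence relation.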

**Example:** Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$
and $\vw = \colll 6 {-4} {\red{3}}$ linearly independent?

That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {\red{3}} = \vec 0 $$ have a non-trivial solution?

The augmented matrix is
$$
\bmat{rrr|r}
-1 & 2 & 6 & 0 \\
3 & 1 & -4 & 0 \\
2 & 1 & \red{3} & 0
\emat
\qquad
\text{which row reduces to}
\qquad
\bmat{rrr|r}
-1 & 2 & 6 & 0 \\
0 & 1 & 2 & 0 \\
0 & 0 & \red{1} & 0
\emat
$$
So what's the answer?
There are 3 variables and 3 leading variables (the rank is 3), so there
are no free variables, which means there is only the trivial solution.
Therefore, the vectors are linearly **independent**.
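The same computational check (again a SymPy sketch) confirms independence for the modified vectors: full rank means no free variables, hence only the trivial solution.

```python
from sympy import Matrix

A = Matrix([[-1, 2,  6],
            [ 3, 1, -4],
            [ 2, 1,  3]])   # only the bottom-right entry changed

A.rank()        # 3: every variable is a leading variable
A.nullspace()   # empty list: only the trivial solution exists
```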

**Example 2.24:** Are the standard unit vectors $\ve_1, \ldots, \ve_n$ in $\R^n$
linearly independent?

Solution:
The augmented matrix is
$$
\bmat{cccc|c}
1 & 0 & 0 & \cdots & 0 \\
0 & 1 & 0 & \cdots & 0 \\
0 & 0 & 1 & \cdots & 0 \\
\vdots & & & \ddots & \vdots \\
0 & 0 & \cdots & 1 & 0
\emat
$$
with $n$ rows and $n$ variables. The rank is $n$, so there is only the
trivial solution. So the standard unit vectors are linearly **independent**.

**Note:** You can sometimes see by inspection that some vectors are
linearly dependent, e.g. if they contain the zero vector, or if one
is a scalar multiple of another. Here's one other situation:

**Theorem 2.8:** If $m > n$, then any set of $m$ vectors in $\R^n$ is linearly
dependent.

**Proof:** The equation $c_1 \vv_1 + \cdots + c_m \vv_m = \vec 0$ is a
homogeneous system with $m$ variables and $n$ equations. By Theorem 2.3, a homogeneous
system with more variables than equations always has a non-trivial solution.
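For instance (a SymPy sketch, using the three vectors from the earlier span question), any three vectors in $\R^2$ give a $2 \times 3$ homogeneous system, so a dependence relation must exist:

```python
from sympy import Matrix

# Columns are (1,0), (0,1), (2,4): three vectors in R^2.
A = Matrix([[1, 0, 2],
            [0, 1, 4]])

ns = A.nullspace()   # non-empty, by Theorem 2.8
c = ns[0]            # a non-trivial solution: A * c = 0
```

Here $c = (-2, -4, 1)$, i.e. $\coll 2 4 = 2 \coll 1 0 + 4 \coll 0 1$, rearranged.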

**On whiteboard:** An example like Example 2.25 in the text, and
a discussion of Theorem 2.7.
