Math 1600A Lecture 11, Section 002, 2 Oct 2013

$ \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} $

Announcements:

Read Section 2.4 for next class. Work through recommended homework questions.

Midterm 1 is tomorrow, 7-8:30pm. It covers material up to the end of Section 2.2, except for linear systems over $\Z_m$. A practice exam is available from the course home page. Students with last names A-Q must write in NS1, and R-Z in NS7. See the missed exam section of the course web page for policies, including for illness.

Tutorials: No quiz this week; the tutorials will focus on review. Take advantage of them! There will be no quizzes next week either.

Office hour: today, 12:30-1:30, MC103B.

Help Centers: Monday-Friday 2:30-6:30 in MC 106.

These lecture notes are now also available in PDF format, posted a day or two after each lecture. Be sure to let me know of any technical problems.

 


Partial review of Lecture 10:

Linear combinations

Recall: A vector $\vv$ is a linear combination of vectors $\vv_1, \vv_2, \ldots, \vv_k$ if there exist scalars $c_1, c_2, \ldots, c_k$ (called coefficients) such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vv . $$

Example: Is $\colll 4 8 6$ a linear combination of $\colll 4 5 6$ and $\colll 2 1 3$?

That is, can we find scalars $x$ and $y$ such that $$x \colll 4 5 6 + y \colll 2 1 3 = \colll 4 8 6 ?$$

Expanding into components, this becomes the linear system $$ \begin{aligned} 4 x + 2 y &= 4 \\ 5 x + \phantom{2} y &= 8 \\ 6 x + 3 y &= 6 \end{aligned} \qquad\text{with augmented matrix}\qquad \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat $$ and we already know how to determine whether this system is consistent: use row reduction!
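The row-reduction computation can also be checked with a computer algebra system. Here is a sketch using SymPy (my own illustration, not part of the course material):

```python
# Solving x*(4,5,6) + y*(2,1,3) = (4,8,6) with SymPy.
from sympy import Matrix, linsolve, symbols

x, y = symbols('x y')
A = Matrix([[4, 2],
            [5, 1],
            [6, 3]])            # columns are the two given vectors
b = Matrix([4, 8, 6])
sols = linsolve((A, b), x, y)   # solve A [x, y]^T = b
print(sols)                     # the unique solution is x = 2, y = -2
```

So the system is consistent: $\colll 4 8 6 = 2 \colll 4 5 6 - 2 \colll 2 1 3$.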

Theorem 2.4: A system with augmented matrix $[A \mid \vb \,]$ is consistent if and only if $\vb$ is a linear combination of the columns of $A$.

This gives a different, geometrical way to understand the solutions of a linear system.

 

 

Spanning Sets of Vectors

Definition: If $S = \{ \vv_1, \ldots, \vv_k \}$ is a set of vectors in $\R^n$, then the set of all linear combinations of $\vv_1, \ldots, \vv_k$ is called the span of $\vv_1, \ldots, \vv_k$ and is denoted $\span(\vv_1, \ldots, \vv_k)$ or $\span(S)$.
If $\span(S) = \R^n$, then $S$ is called a spanning set for $\R^n$.

Example: $\span(\ve_1, \ve_2, \ldots, \ve_n) = \R^n$.

Example: The span of $\vu = \colll 1 2 3$ and $\vv = \colll 4 5 6$ consists of every vector $\vx$ that can be written as $$ \vx = s \vu + t \vv $$ for some scalars $s$ and $t$. Since $\vu$ and $\vv$ are not parallel, this is the plane through the origin in $\R^3$ with direction vectors $\vu$ and $\vv$.
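Membership in a span can be tested by solving a linear system, just as above. A small SymPy sketch (the test vectors $(5,7,9)$ and $(0,0,1)$ are my own examples):

```python
# Testing whether a vector lies in span(u, v) by solving s*u + t*v = x.
from sympy import Matrix, linsolve, symbols

s, t = symbols('s t')
u = Matrix([1, 2, 3])
v = Matrix([4, 5, 6])
A = u.row_join(v)                                  # columns are u and v
in_plane = linsolve((A, Matrix([5, 7, 9])), s, t)
print(in_plane)        # contains (1, 1): u + v = (5,7,9), so it is in the span
off_plane = linsolve((A, Matrix([0, 0, 1])), s, t)
print(off_plane)       # no solutions: (0,0,1) is not in the plane
```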

 


New material: Section 2.3: Spanning Sets and Linear Independence

Question: What is $\span(\coll 1 2)$? What is $\span(\coll 1 2, \coll 2 4)$?

Question: We saw that $\span(\coll 1 0, \coll 0 1) = \R^2$. What is $\span(\coll 1 0, \coll 0 1, \coll 2 4)$?

Question: What vector is always in $\span(\vv_1, \vv_2, \ldots, \vv_k)$?

 


Linear Dependence and Independence

Suppose that we have vectors $\vu$, $\vv$ and $\vw$ in $\R^n$ such that $2 \vu + 3 \vv - 2 \vw = \vec 0$. This can be solved for any of the vectors in terms of the others, e.g. $\vu = - \frac{3}{2} \vv + \vw$. We say that these vectors are linearly dependent.

Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly dependent if there are scalars $c_1, \ldots, c_k$, at least one of which is nonzero, such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 . $$ Since at least one of the scalars is non-zero, the corresponding vector can be expressed as a linear combination of the others.

Example: $ \coll {-2} 4 - 2 \coll {-1} 2 + 0 \coll 5 6 = \coll 0 0 $, so the vectors $\coll {-2} 4$, $\coll {-1} 2$ and $\coll 5 6$ are linearly dependent.
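The dependence relation above is easy to verify by direct arithmetic; for instance, with SymPy (my own check):

```python
# Verifying the dependence relation by direct arithmetic.
from sympy import Matrix

v1 = Matrix([-2, 4])
v2 = Matrix([-1, 2])
v3 = Matrix([5, 6])
combo = 1*v1 - 2*v2 + 0*v3      # coefficients (1, -2, 0), not all zero
print(combo == Matrix([0, 0]))  # True: a non-trivial combination gives zero
```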

Note that each of the first two can be expressed as a linear combination of the other two, but the third one is not a linear combination of the first two.

Example: Are the vectors $\ve_1 = \coll 1 0$ and $\ve_2 = \coll 0 1$ linearly dependent?

Solution:

Theorem 2.5: The vectors $\vv_1, \ldots, \vv_k$ are linearly dependent if and only if at least one of them can be expressed as a linear combination of the others.

Proof: We've seen one direction. For the other, if $\vv_k = c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1}$, then $c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1} - \vv_k = \vec 0$, so the vectors are linearly dependent. The same argument works if it is a different vector that can be expressed in terms of the others.

Example: What about the vectors $\ve_1$, $\ve_2$ and $\coll 0 0$?

Solution:

Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly independent if it is not linearly dependent.

Another way to say this is that the system $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 $$ has only the trivial solution $c_1 = \cdots = c_k = 0$.

This is something we know how to figure out! Use row reduction!
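This test is mechanical enough to automate. A small helper (my own sketch, not from the text): the vectors are linearly independent exactly when every column of the matrix $[\vv_1 \cdots \vv_k]$ is a pivot column after row reduction.

```python
# Independence test via row reduction: every column must be a pivot column.
from sympy import Matrix

def independent(*vectors):
    A = Matrix.hstack(*vectors)     # columns are the given vectors
    _, pivots = A.rref()            # rref returns (reduced matrix, pivot columns)
    return len(pivots) == len(vectors)

print(independent(Matrix([1, 0]), Matrix([0, 1])))    # True
print(independent(Matrix([-2, 4]), Matrix([-1, 2])))  # False: parallel vectors
```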

Example: Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$ and $\vw = \colll 6 {-4} {-2}$ linearly independent?

That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {-2} = \vec 0 $$ have a non-trivial solution?

The augmented matrix is $$ \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 3 & 1 & -4 & 0 \\ 2 & 1 & -2 & 0 \emat \qquad \text{which row reduces to} \qquad \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 0 \emat $$ So what's the answer?
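The zero row means there is a free variable, so the system has non-trivial solutions. This can be confirmed with SymPy (an illustration, not part of the notes):

```python
# The coefficient matrix has a non-trivial nullspace, so the vectors
# u, v, w are linearly dependent.
from sympy import Matrix

A = Matrix([[-1, 2,  6],
            [ 3, 1, -4],
            [ 2, 1, -2]])       # columns are u, v, w
null = A.nullspace()
c = null[0]                     # one basis vector for the solution set
print(A * c)                    # the zero vector: a non-trivial solution
```

In this case one finds the relation $2 \vu - 2 \vv + \vw = \vec 0$.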

Example: Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$ and $\vw = \colll 6 {-4} {\red{3}}$ linearly independent?

That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {\red{3}} = \vec 0 $$ have a non-trivial solution?

The augmented matrix is $$ \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 3 & 1 & -4 & 0 \\ 2 & 1 & \red{3} & 0 \emat \qquad \text{which row reduces to} \qquad \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & \red{1} & 0 \emat $$ So what's the answer?
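The same check for the modified third vector (again just an illustration):

```python
# Now every column is a pivot column, so the nullspace is trivial.
from sympy import Matrix

A = Matrix([[-1, 2,  6],
            [ 3, 1, -4],
            [ 2, 1,  3]])       # columns are u, v, and the new w
print(A.nullspace())            # []: only the trivial solution, so independent
```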

Example 2.24: Are the standard unit vectors $\ve_1, \ldots, \ve_n$ in $\R^n$ linearly independent?

Solution:

Note: You can sometimes see by inspection that some vectors are linearly dependent, e.g. if they contain the zero vector, or if one is a scalar multiple of another. Here's one other situation:

Theorem 2.8: If $m > n$, then any set of $m$ vectors in $\R^n$ is linearly dependent.

Proof: The system $c_1 \vv_1 + \cdots + c_m \vv_m = \vec 0$ is a homogeneous system with $m$ variables and $n$ equations. By Theorem 2.3, a homogeneous system with more variables than equations always has a non-trivial solution, so the vectors are linearly dependent.
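To illustrate Theorem 2.8 with a concrete case (my own example): any three vectors in $\R^2$ are linearly dependent, since the homogeneous system has $3$ variables but only $2$ equations.

```python
# Three vectors in R^2 always admit a non-trivial dependence relation.
from sympy import Matrix

A = Matrix([[1, 0, 2],          # columns: (1,0), (0,1), (2,4)
            [0, 1, 4]])
null = A.nullspace()            # 3 variables, 2 equations => a free variable
c = null[0]
print(A * c)                    # the zero vector: a non-trivial dependence
```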

On whiteboard: An example like Example 2.25 in the text, and a discussion of Theorem 2.7.
