Math 1600 Lecture 12, Section 2, 1 Oct 2014

$ \newcommand{\bdmat}[1]{\left|\begin{array}{#1}} \newcommand{\edmat}{\end{array}\right|} \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\ccollll}[4]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\ccolllll}[5]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \renewcommand{\Re}{\operatorname{Re}} \renewcommand{\Im}{\operatorname{Im}} \renewcommand{\Arg}{\operatorname{Arg}} \renewcommand{\arg}{\operatorname{arg}} \newcommand{\adj}{\textrm{adj}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\smalltext}[1]{{\small\text{#1}}} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\querytext}[1]{\toggle{\text{?}\vphantom{\text{#1}}}{\text{#1}}\endtoggle} \newcommand{\query}[1]{\toggle{\text{?}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\text{?}}{#1}\endtoggle} \newcommand{\bv}{\mathbf{v}} %\require{AMScd} $

Announcements:

Read Sections 3.0 and 3.1 for next class. (2.5 is not covered.) Work through recommended homework questions.

Quiz 3 is this week, and will focus on the material in Sections 2.2 and 2.3.

Office hour: Today, 11:30-noon, MC103B.

Help Centers: Monday-Friday 2:30-6:30 in MC 106.

Partial review of Lectures 10 and 11:

Linear combinations

Theorem 2.4: Write $A$ for the matrix with columns $\vv_1, \vv_2, \ldots, \vv_k$. Then $\vv$ is a linear combination of $\vv_1, \vv_2, \ldots, \vv_k$ if and only if the system with augmented matrix $[A \mid \vv \,]$ is consistent.

And we know how to determine whether a system is consistent! Use row reduction!
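As a quick numerical sanity check of Theorem 2.4, here is a small NumPy sketch (the vectors are made up for illustration): the system $[A \mid \vv\,]$ is consistent exactly when appending $\vv$ as an extra column does not increase the rank.

```python
import numpy as np

# Hypothetical example: is v a linear combination of v1 = (1,2,3) and v2 = (4,5,6)?
A = np.column_stack(([1, 2, 3], [4, 5, 6]))   # columns are v1 and v2
v = np.array([7, 8, 9])

# [A | v] is consistent iff augmenting by v does not increase the rank
consistent = np.linalg.matrix_rank(np.column_stack((A, v))) == np.linalg.matrix_rank(A)
print(consistent)
```

Here `consistent` is `True`, since $\vv = -\vv_1 + 2\vv_2$.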

 

Spanning Sets of Vectors

Definition: If $S = \{ \vv_1, \ldots, \vv_k \}$ is a set of vectors in $\R^n$, then the set of all linear combinations of $\vv_1, \ldots, \vv_k$ is called the span of $\vv_1, \ldots, \vv_k$ and is denoted $\span(\vv_1, \ldots, \vv_k)$ or $\span(S)$.
If $\span(S) = \R^n$, then $S$ is called a spanning set for $\R^n$.

Example: $\span(\ve_1, \ve_2, \ldots, \ve_n) = \R^n$.

Example: The span of $\vu = \colll 1 2 3$ and $\vv = \colll 4 5 6$ is the plane through the origin in $\R^3$ with direction vectors $\vu$ and $\vv$.
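We can confirm numerically that this span is two-dimensional: the matrix with columns $\vu$ and $\vv$ has rank 2, so the span is a plane rather than a line. (A sketch using NumPy's `matrix_rank`.)

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])
A = np.column_stack((u, v))

# rank 2 means the span is a plane through the origin in R^3
rank = np.linalg.matrix_rank(A)
print(rank)  # 2
```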

 

Linear Dependence and Independence

Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly dependent if there are scalars $c_1, \ldots, c_k$, at least one of which is nonzero, such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 . $$ If the only solution to this equation is the trivial solution $c_1 = c_2 = \cdots = c_k = 0$, then the set of vectors is said to be linearly independent. Another way to say this is:

Theorem 2.6: The homogeneous system $[A \mid \vec 0\,]$ has a non-trivial solution if and only if the columns of $A$ are linearly dependent.

Once again, this is something we know how to figure out! Use row reduction!
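Theorem 2.6 translates directly into a rank test: the homogeneous system has a nontrivial solution exactly when $\rank(A)$ is less than the number of columns. A minimal NumPy sketch, using a made-up matrix whose third column is the sum of the first two:

```python
import numpy as np

# Columns of A; dependent by construction: third column = first + second
A = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 2]])

# Theorem 2.6: columns are linearly dependent iff rank(A) < number of columns
dependent = np.linalg.matrix_rank(A) < A.shape[1]
print(dependent)  # True
```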

Theorem 2.5: The vectors $\vv_1, \ldots, \vv_k$ are linearly dependent if and only if at least one of them can be expressed as a linear combination of the others.

Fact: Any set of vectors containing the zero vector is linearly dependent.

Linear dependence captures the idea that there is redundancy in the set of vectors: a smaller set will have the same span. Put another way, the vectors will span something smaller than you expect.

Note: You can sometimes see by inspection that some vectors are linearly dependent, e.g. if they contain the zero vector, or if one is a scalar multiple of another. Here's one other situation:

Theorem 2.8: If $m > n$, then any set of $m$ vectors in $\R^n$ is linearly dependent.

Theorem 2.7: Let $\vv_1, \vv_2, \ldots, \vv_m$ be row vectors in $\R^n$, and let $A$ be the $m \times n$ matrix whose rows are these vectors. Then $\vv_1, \vv_2, \ldots, \vv_m$ are linearly dependent if and only if $\rank(A) < m$.

We saw this by doing row reduction on $A$ and keeping track of how each new row is a linear combination of the previous rows. See Example 2.25 in the text.
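Theorems 2.7 and 2.8 can be checked together on a small made-up example: three row vectors in $\R^2$ must be dependent by Theorem 2.8, and Theorem 2.7 confirms it via the rank.

```python
import numpy as np

# Three row vectors in R^2 (m = 3 > n = 2, so Theorem 2.8 predicts dependence)
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
m = A.shape[0]

# Theorem 2.7: rows are linearly dependent iff rank(A) < m
dependent = np.linalg.matrix_rank(A) < m   # rank is at most 2, which is < 3
print(dependent)  # True
```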

Questions?

 

New material: Section 2.4: Network Analysis

Example 2.30: Consider a network of water pipes as in the figure to the right.

Some pipes have a known amount of water flowing (measured in litres per minute) and some have an unknown amount. Let's try to figure out the possible flows.

Conservation of flow tells us that at each node, the amount of water entering must equal the amount leaving.

Here are the constraints: $$ \kern-8ex \begin{aligned} &\smalltext{Node A}: &\!\!\! 5 + 10\ &= f_1 + f_4 &\!\!\!\implies\quad f_1 + f_4\ &= 15 \\ &\smalltext{Node B}: &\!\!\! f_1\ &= 10 + f_2 &\!\!\!\implies\quad f_1 - f_2\ &= 10 \\ &\smalltext{Node C}: &\!\!\! f_2 + f_3 + 5\ &= 30 &\!\!\!\implies\quad f_2 + f_3\ &= 25 \\ &\smalltext{Node D}: &\!\!\! f_4 + 20\ &= f_3 &\!\!\!\implies\quad f_3 - f_4\ &= 20 \\ \end{aligned} $$ We row reduce the augmented matrix for the equations on the right: $$ \kern-8ex \bmat{rrrr|r} 1 & 0 & 0 & 1 & 15 \\ 1 & -1 & 0 & 0 & 10 \\ 0 & 1 & 1 & 0 & 25 \\ 0 & 0 & 1 & -1 & 20 \emat \longrightarrow \bmat{rrrr|r} 1 & 0 & 0 & 1 & 15 \\ 0 & 1 & 0 & 1 & 5 \\ 0 & 0 & 1 & -1 & 20 \\ 0 & 0 & 0 & 0 & 0 \emat $$ The solutions are $$ \begin{aligned} f_1\ &= 15 - t \\ f_2\ &= \ph 5 - t \\ f_3\ &= 20 + t \\ f_4\ &= \phantom{20 + {}} t \end{aligned} $$ So if we control the flow on the AD branch, the other three flows are determined.
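We can verify this one-parameter family of solutions numerically: for any value of $t$, the four flows above should satisfy all four node equations. A short NumPy check:

```python
import numpy as np

# Coefficient matrix and right-hand side for f1..f4 from the four node equations
A = np.array([[1,  0, 0,  1],
              [1, -1, 0,  0],
              [0,  1, 1,  0],
              [0,  0, 1, -1]], dtype=float)
b = np.array([15, 10, 25, 20], dtype=float)

# The one-parameter family of solutions found by row reduction
def flows(t):
    return np.array([15 - t, 5 - t, 20 + t, t], dtype=float)

# Every value of t gives a solution of the system
for t in [0, 2.5, 5]:
    assert np.allclose(A @ flows(t), b)
```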

In the text, flows are always assumed to be nonnegative, and that places constraints on $t$.

Because of $f_4$, we must have $t \geq 0$.

And from $f_2$, we must have $t \leq 5$.

The other constraints don't add anything, so we find that $0 \leq t \leq 5$.

This lets us determine the minimum and maximum flows: $$ \kern-6ex \begin{aligned} f_1\ &= 15 - t \\ f_2\ &= \ph 5 - t \\ f_3\ &= 20 + t \\ f_4\ &= \phantom{20 + {}} t \end{aligned} \qquad\implies\qquad \begin{aligned} 10\ &\leq f_1 \leq 15 \\ 0\ &\leq f_2 \leq 5 \\ 20\ &\leq f_3 \leq 25 \\ 0\ &\leq f_4 \leq 5 \end{aligned} $$
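These bounds can be double-checked by sweeping $t$ over the feasible interval $[0, 5]$ and recording the extreme values of each flow:

```python
import numpy as np

# Sweep t over the feasible range 0 <= t <= 5 (endpoints included)
ts = np.linspace(0, 5, 501)
F = np.array([[15 - t, 5 - t, 20 + t, t] for t in ts])

mins, maxs = F.min(axis=0), F.max(axis=0)
print(mins, maxs)  # f1 in [10,15], f2 in [0,5], f3 in [20,25], f4 in [0,5]
```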

Exercise 2.4.16:

This figure represents traffic flow on a grid of one-way streets, in vehicles per minute.

Since the same number of vehicles should enter and leave each intersection, we again get a system of equations that must be satisfied.

On board:
(a) Set up and solve system
(b) If $f_4 = 10$, what are other flows?
(c) What are minimum and maximum flows on each street?
(extra) What can you say about how $f_2$ and $f_3$ compare?
(d) What happens if all directions are reversed?
(extra) What happens if the 5 changes to a 0 because of construction?

 

Electrical Networks

In an electrical network, a battery has a voltage $V$ which produces a flow of current $I$ in the wires.

We model devices in the circuit, such as light bulbs and motors, as resistors, because they slow down the flow of current by taking away some of the voltage:

Ohm's Law: voltage drop = resistance (in ohms) times current (in amps): $$ V = R I .$$ (The book uses $E$ for the voltage drop.)

Kirchhoff's Voltage Law says that the sum of the voltage drops around any closed loop in a circuit is equal to the voltage provided by the batteries in that loop.
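As a warm-up before the board example, here is a sketch of Ohm's Law and the Voltage Law on a hypothetical single-loop circuit (the values are made up): a 9 V battery driving two resistors in series, so $R_1 I + R_2 I = V$.

```python
# Hypothetical single-loop circuit: 9 V battery, 2-ohm and 4-ohm resistors in series.
# Kirchhoff's Voltage Law: the two voltage drops sum to the battery voltage.
V, R1, R2 = 9.0, 2.0, 4.0

I = V / (R1 + R2)          # current in amps: 9 / 6 = 1.5
drops = [R1 * I, R2 * I]   # voltage drops across each resistor (Ohm's Law)

assert abs(sum(drops) - V) < 1e-12   # drops sum back to the battery voltage
```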

On board: Analyze simple circuit, then 2.4.22 (a).

To handle circuits with branching, we need another law. (Draw Exercise 2.4.20 on board.)

Kirchhoff's Current Law says that the sum of the currents flowing into a node equals the sum of the currents leaving, just like for other networks.
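Combining both laws gives a linear system in the branch currents. A minimal sketch on a hypothetical branching circuit (not the textbook exercise): an 8 V battery, a 2-ohm resistor carrying $I_1$, which then splits into two parallel 4-ohm branches carrying $I_2$ and $I_3$.

```python
import numpy as np

# Current law at the branch node:     I1 - I2 - I3 = 0
# Voltage law around the left loop:   2*I1 + 4*I2  = 8
# Voltage law around the right loop:  4*I2 - 4*I3  = 0
A = np.array([[1, -1, -1],
              [2,  4,  0],
              [0,  4, -4]], dtype=float)
b = np.array([0, 8, 0], dtype=float)

I1, I2, I3 = np.linalg.solve(A, b)
print(I1, I2, I3)  # I1 ≈ 2, I2 ≈ 1, I3 ≈ 1 (in amps)
```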

On board: Exercises 2.4.20 and 2.4.22 (b).

 

The other applications in Section 2.4, and the short Exploration on GPS after Section 2.4, are also quite interesting, but won't be covered in the course. Next class: Section 3.1.