Math 1600A Lecture 15, Section 2, 11 Oct 2013

$ \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} $

Announcements:

Read Section 3.3 for Wednesday. Work through recommended homework questions.

Tutorials: Quiz 3 next week will cover up to the end of Section 3.2.

Solutions to the midterm are available from the course home page. Class average was 31/40 = 77.5%. Great work! But keep in mind that the material naturally gets much more difficult.

Office hour: next Wednesday, 12:30-1:30, MC103B. (No office hour on Monday.)

Help Centers: Monday-Friday 2:30-6:30 in MC 106. (But not on Monday, Oct 14.)

 


Partial review of Lectures 13 and 14:

Matrix multiplication

Definition: If $A$ is $m \times \red{n}$ and $B$ is $\red{n} \times r$, then the product $C = AB$ is the $m \times r$ matrix whose $i,j$ entry is $$ \begin{aligned} c_{ij} &= a_{i\red{1}} b_{\red{1}j} + a_{i\red{2}} b_{\red{2}j} + \cdots + a_{i\red{n}} b_{\red{n}j} = \sum_{\red{k}=1}^{n} a_{i\red{k}} b_{\red{k}j} \\ &= \row_i(A) \cdot \col_j(B) . \end{aligned} $$ Here is how I remember the shape of $AB$: $$ \mystack{A}{m \times n} \ \ \mystack{B}{n \times r} \mystack{=}{\strut} \mystack{AB}{m \times r} $$
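For example, for $2 \times 2$ matrices (an illustrative computation, not an example from the text): $$ \bmat{rr} 1 & 2 \\ 3 & 4 \emat \bmat{rr} 0 & 1 \\ 5 & -1 \emat = \bmat{rr} 1 \cdot 0 + 2 \cdot 5 & 1 \cdot 1 + 2 \cdot (-1) \\ 3 \cdot 0 + 4 \cdot 5 & 3 \cdot 1 + 4 \cdot (-1) \emat = \bmat{rr} 10 & -1 \\ 20 & -1 \emat . $$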

Note: In particular, if $B$ is a column vector in $\R^n$, then $AB$ is a column vector in $\R^m$. So one thing a matrix $A$ can do is transform column vectors into column vectors. This point of view will be important later.

For the most part, matrix multiplication behaves like multiplication of real numbers, but there are several differences:

We can have $A \neq O$ but $A^k = O$ for some $k > 1$.
We can have $B \neq \pm I$, but $B^4 = I$.
We can have $AB \neq BA$.
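For instance (standard small examples, not necessarily the ones from lecture): $$ A = \bmat{rr} 0 & 1 \\ 0 & 0 \emat \neq O \quad\text{but}\quad A^2 = O ; \qquad B = \bmat{rr} 0 & -1 \\ 1 & 0 \emat \quad\text{has}\quad B^2 = -I, \text{ so } B^4 = I ; $$ and $$ \bmat{rr} 1 & 0 \\ 0 & 0 \emat \bmat{rr} 0 & 1 \\ 0 & 0 \emat = \bmat{rr} 0 & 1 \\ 0 & 0 \emat \qquad\text{while}\qquad \bmat{rr} 0 & 1 \\ 0 & 0 \emat \bmat{rr} 1 & 0 \\ 0 & 0 \emat = O . $$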

But most expected properties do hold:

Theorem 3.3: Let $A$, $B$ and $C$ be matrices of the appropriate sizes, and let $k$ be a scalar. Then:

(a) $A(BC) = (AB)C$ (associativity)
(b) $A(B + C) = AB + AC$ (left distributivity)
(c) $(A+B)C = AC + BC$ (right distributivity)
(d) $k(AB) = (kA)B = A(kB)$ (no cool name)
(e) $I_m A = A = A I_n$ if $A$ is $m \times n$ (identity)

New material: Sections 3.1 and 3.2 continued.

Example 3.20: If $A$ and $B$ are square matrices of the same size, is $(A+B)^2 = A^2 + 2 AB + B^2$?
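Answer: not in general. Expanding with distributivity (but without assuming commutativity, which can fail): $$ (A+B)^2 = (A+B)(A+B) = A^2 + AB + BA + B^2 , $$ and this equals $A^2 + 2AB + B^2$ if and only if $AB = BA$.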

Note: Theorem 3.3 shows that a scalar matrix $kI_n$ commutes with every $n \times n$ matrix $A$.
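Indeed, using (d) and (e): $(kI_n)A = k(I_n A) = kA = k(A I_n) = A(kI_n)$.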

Partitioned Matrices

Sometimes it is natural to view a matrix as partitioned into blocks. For example: $$ \kern-6ex A = \bmat{rrrrr} 1 & 0 & 0 & 2 & -1 \\ 0 & 1 & 0 & 1 & 3 \\ 0 & 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 1 & 7 \\ 0 & 0 & 0 & 7 & 2 \emat = \bmat{rrr|rr} 1 & 0 & 0 & 2 & -1 \\ 0 & 1 & 0 & 1 & 3 \\ 0 & 0 & 1 & 4 & 0 \\ \hline 0 & 0 & 0 & 1 & 7 \\ 0 & 0 & 0 & 7 & 2 \emat = \bmat{cc} I & D \\ O & C \emat $$ This can make matrix multiplication much easier when there are blocks that are zero or an identity matrix. For example, if $$ B = \bmat{rr} 0 & 0 \\ 0 & 0 \\ 0 & 0 \\ \hline 1 & 0 \\ 0 & 1 \emat = \bmat{c} O \\ I \emat $$ then $$ \kern-6ex AB = \bmat{cc} I & D \\ O & C \emat \, \bmat{c} O \\ I \emat = \bmat{c} IO + DI \\ OO + CI \emat = \bmat{c} D \\ C \emat = \bmat{rr} 2 & -1 \\ 1 & 3 \\ 4 & 0 \\ \hline 1 & 7 \\ 7 & 2 \emat $$ You pretend that the submatrices are numbers and do matrix multiplication. As long as all of the sizes match up, this works.

See Example 3.12 for a larger, more complicated worked example.

The most common (and important) cases are when one or both of the matrices are partitioned into rows or columns. For example, if $A$ is $m \times n$ and $B$ is $n \times r$, and we partition $B$ into its columns as $B = [ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ]$, then we have: $$ AB = A[ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ] = [\, A\vb_1 \mid A\vb_2 \mid \cdots \mid A\vb_r ] , $$ where we think of $A$ and the $\vb_i$'s as scalars. The first column of $AB$ consists of the dot products of the rows of $A$ with the first column $\vb_1$ of $B$.

Example on whiteboard: $2 \times 3$ times $3 \times 2$.
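(The whiteboard example is not recorded in these notes; here is one possible version.) If $$ A = \bmat{rrr} 1 & 2 & 0 \\ 0 & 1 & 3 \emat \qquad\text{and}\qquad B = \bmat{rr} 1 & 0 \\ 2 & 1 \\ 0 & 1 \emat , $$ then $\vb_1 = \colll{1}{2}{0}$ and $\vb_2 = \colll{0}{1}{1}$, so $A\vb_1 = \coll{5}{2}$, $A\vb_2 = \coll{2}{4}$, and $$ AB = [\, A\vb_1 \mid A\vb_2 ] = \bmat{rr} 5 & 2 \\ 2 & 4 \emat . $$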

Note that each column of $AB$ is a linear combination of the columns of $A$.
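For instance, in the example above, the first column of $AB$ is $$ A\vb_1 = 1 \coll{1}{0} + 2 \coll{2}{1} + 0 \coll{0}{3} = \coll{5}{2} , $$ a linear combination of the columns of $A$ with coefficients taken from $\vb_1$.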

Similarly, if we partition $A$ into rows, we can compute $$ AB = \bmat{c} A_1 \\ \hline A_2 \\ \hline \vdots \\ \hline A_m \emat B = \bmat{c} A_1 B \\ \hline A_2 B \\ \hline \vdots \\ \hline A_m B \emat $$

Same example on whiteboard.
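Continuing the same illustrative example: $A_1 = \bmat{rrr} 1 & 2 & 0 \emat$ and $A_2 = \bmat{rrr} 0 & 1 & 3 \emat$, so $$ A_1 B = \bmat{rr} 5 & 2 \emat \qquad\text{and}\qquad A_2 B = \bmat{rr} 2 & 4 \emat , $$ which stack to give the same $AB$ as before.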

If we partition $A$ into rows and $B$ into columns, we get $$ AB = \bmat{c} A_1 \\ \hline A_2 \\ \hline \vdots \\ \hline A_m \emat [ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ] = \bmat{ccc} A_1 \vb_1 & \cdots & A_1 \vb_r \\ \vdots & & \vdots \\ A_m \vb_1 & \cdots & A_m \vb_r \emat $$ which is just the usual description of $AB$, where the $ij$ entry is the dot product of the $i$th row of $A$ with the $j$th column of $B$!

(Outer products and Example 3.11 not covered.)

 

The Transpose and Symmetric Matrices

Here's another operation on matrices, which has no analog for real numbers:

Definition: The transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $A^T$ whose $ij$ entry is the $ji$ entry of $A$.

Example 3.14: The transposes of $$ A = \bmat{rrr} 1 & 3 & 2 \\ 5 & 0 & 1 \emat, \qquad B = \bmat{rr} a & b \\ c & d \emat , \qquad \text{and} \qquad C = \bmat{rrr} 5 & -1 & 2 \emat $$ are $$ A^T = \bmat{rr} 1 & 5 \\ 3 & 0 \\ 2 & 1 \emat, \qquad B^T = \bmat{rr} a & c \\ b & d \emat , \qquad \text{and} \qquad C^T = \bmat{r} 5 \\ -1 \\ 2 \emat . $$ Note that the columns and rows get interchanged.

One use of the transpose is to convert between row vectors and column vectors. In particular, we can use this to express the dot product in terms of matrix multiplication. If $$ \vu = \bmat{c} u_1 \\ u_2 \\ \vdots \\ u_n \emat \qquad\text{and}\qquad \vv = \bmat{c} v_1 \\ v_2 \\ \vdots \\ v_n \emat $$ then $$ \vu^T \vv = [ u_1 \, u_2 \, \cdots \, u_n ] \bmat{c} v_1 \\ v_2 \\ \vdots \\ v_n \emat = u_1 v_1 + \cdots + u_n v_n = \vu \cdot \vv $$
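For example, if $\vu = \coll{1}{2}$ and $\vv = \coll{3}{-1}$, then $$ \vu^T \vv = \bmat{rr} 1 & 2 \emat \coll{3}{-1} = 1 \cdot 3 + 2 \cdot (-1) = 1 = \vu \cdot \vv . $$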

Properties of the transpose

Theorem 3.4: Let $A$ and $B$ be matrices of the appropriate sizes, and let $k$ be a scalar. Then:

(a) $(A^T)^T = A$
(b) $(A+B)^T = A^T + B^T$
(c) $(kA)^T = k(A^T)$
(d) $(AB)^T = B^T A^T$ (note the reversed order!)
(e) $(A^r)^T = (A^T)^r$ for all nonnegative integers $r$

(a), (b) and (c) are easy to see. (d) is more of a surprise, so it is worth explaining:

Proof of (d): Suppose $A$ is $m \times n$ and $B$ is $n \times r$. Then both $(AB)^T$ and $B^T A^T$ are $r \times m$. We have to check that the entries are equal: $$ \begin{aligned}\kern-4ex [(AB)^T]_{ij} &= (AB)_{ji} = \row_j(A) \cdot \col_i(B) = \col_j(A^T) \cdot \row_i(B^T) \\ &= \row_i(B^T) \cdot \col_j(A^T) = [(B^T)(A^T)]_{ij} . \qquad\Box \end{aligned} $$
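A quick numerical check (an illustration, not from the text): if $$ A = \bmat{rr} 1 & 2 \\ 3 & 4 \emat \qquad\text{and}\qquad B = \bmat{rr} 0 & 1 \\ 1 & 1 \emat , $$ then $AB = \bmat{rr} 2 & 3 \\ 4 & 7 \emat$, so $(AB)^T = \bmat{rr} 2 & 4 \\ 3 & 7 \emat$, and indeed $$ B^T A^T = \bmat{rr} 0 & 1 \\ 1 & 1 \emat \bmat{rr} 1 & 3 \\ 2 & 4 \emat = \bmat{rr} 2 & 4 \\ 3 & 7 \emat , $$ while $A^T B^T = \bmat{rr} 3 & 4 \\ 4 & 6 \emat$ is different.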

Note that (b) and (d) extend to several matrices. For example: $$ (A + B + C)^T = ((A+B) + C)^T = (A+B)^T + C^T = A^T + B^T + C^T $$ and $$ (ABC)^T = ((AB)C)^T = C^T (AB)^T = C^T B^T A^T $$ In particular, (e) follows: $(A^r)^T = (A^T)^r$.

Symmetric matrices

Definition: A square matrix $A$ is symmetric if $A^T = A$. That is, $A_{ij} = A_{ji}$ for every $i$ and $j$.

Example: $\bmat{rr} 1 & 2 \\ 2 & 3 \emat$ is symmetric.

There are two ways to get a symmetric matrix from a non-symmetric matrix:

1. If $A$ is square, then $A + A^T$ is symmetric. This is because $$ (A + A^T)^T = A^T + (A^T)^T = A^T + A = A + A^T . $$ Example on whiteboard.
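For instance (one possible version of the whiteboard example): $$ A = \bmat{rr} 1 & 5 \\ 2 & 3 \emat \qquad\text{gives}\qquad A + A^T = \bmat{rr} 1 & 5 \\ 2 & 3 \emat + \bmat{rr} 1 & 2 \\ 5 & 3 \emat = \bmat{rr} 2 & 7 \\ 7 & 6 \emat , $$ which is symmetric.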

2. If $B$ is any matrix, then $B^T B$ is symmetric. This is because $$ (B^T B)^T = B^T (B^T)^T = B^T B . $$ The same kind of argument shows that $B B^T$ is symmetric.

Example on whiteboard.
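For instance (again, one possible version of the whiteboard example), take $B$ to be the matrix $A$ from Example 3.14: $$ B^T B = \bmat{rr} 1 & 5 \\ 3 & 0 \\ 2 & 1 \emat \bmat{rrr} 1 & 3 & 2 \\ 5 & 0 & 1 \emat = \bmat{rrr} 26 & 3 & 7 \\ 3 & 9 & 6 \\ 7 & 6 & 5 \emat , $$ a symmetric $3 \times 3$ matrix, while $B B^T = \bmat{rr} 14 & 7 \\ 7 & 26 \emat$ is a symmetric $2 \times 2$ matrix.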

 

Challenge problems

Find a $3 \times 3$ matrix $A$ such that $A^2 \neq O$ but $A^3 = O$.

Find a $2 \times 2$ matrix $A$ such that $A \neq I_2$ but $A^3 = I_2$.
