Today we finish 4.3 and start 4.4. Continue **reading** Section 4.4 for Wednesday.
Work through recommended homework questions.

**Tutorials:** Quiz 8 covers 4.2, 4.3, and the parts of Appendix D
that we covered in class.

**Help Centers:** Monday-Friday 2:30-6:30 in MC 106.

The **final exam** will take place on Tuesday, April 22, 2-5pm.
All students write in AH201 (Alumni Hall).
The final exam will cover all the material from the course, but will
emphasize the material after the midterm.
See the course home page
for final exam **conflict** policy.
You should **immediately** notify the registrar or your Dean
(and your instructor) of any conflicts!

**Theorem D.2 (The Factor Theorem):** Let $f$ be a polynomial and
let $a$ be a constant. Then $a$ is a root of $f(x)$ (i.e. $f(a) = 0$)
if and only if $x - a$ is a factor of $f(x)$ (i.e. $f(x) = (x - a) g(x)$
for some polynomial $g$).

The largest $k$ such that $(x-a)^k$
is a factor of $f$ is called the **multiplicity** of the root $a$ in $f$.

**Example:** Let $f(x) = x^2 - 2x + 1$. Since $f(1) = 1 - 2 + 1 = 0$,
$1$ is a root of $f$. And since $f(x) = (x-1)^2$, $1$ has multiplicity $2$.
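The Factor Theorem also gives a concrete algorithm: dividing $f$ by $x - a$ (synthetic division) leaves remainder $f(a)$, and repeated division counts the multiplicity. A small sketch in plain Python (the helper names are just for illustration):

```python
def synthetic_division(coeffs, a):
    """Divide the polynomial with the given coefficients (highest degree
    first) by (x - a).  Returns (quotient_coeffs, remainder); by the
    Factor Theorem the remainder equals f(a)."""
    results = []
    carry = 0
    for c in coeffs:
        carry = carry * a + c
        results.append(carry)
    # The last carry is the remainder f(a); the rest are the quotient.
    return results[:-1], results[-1]

def multiplicity(coeffs, a):
    """Largest k such that (x - a)^k divides f, by repeated division."""
    k = 0
    while len(coeffs) > 1:
        quotient, remainder = synthetic_division(coeffs, a)
        if remainder != 0:
            break
        coeffs, k = quotient, k + 1
    return k

# f(x) = x^2 - 2x + 1 = (x - 1)^2, so 1 is a root of multiplicity 2.
print(multiplicity([1, -2, 1], 1))  # 2
```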

In the case of an eigenvalue, we call its multiplicity in the characteristic
polynomial the **algebraic multiplicity** of this eigenvalue.

We also define the **geometric multiplicity** of an eigenvalue $\lambda$
to be the dimension of the corresponding eigenspace $E_\lambda$.

**Theorem 4.15:** The eigenvalues of a triangular matrix
are the entries on its main diagonal (repeated according to
their algebraic multiplicity).

**Theorem D.4 (The Fundamental Theorem of Algebra):**
A polynomial of degree $n$ has at most $n$ distinct roots.
In fact, the sum of the multiplicities is at most $n$.

Therefore:

**Theorem:**
An $n \times n$ matrix $A$ has at most $n$ distinct eigenvalues.
In fact, the sum of the algebraic multiplicities is at most $n$.

A **complex number** is a number of the form $a + bi$, where $a$
and $b$ are real numbers and $i$ is a symbol such that $i^2 = -1$.

**Addition:** $(a+bi)+(c+di) = (a+c) + (b+d)i$, like vector addition.

**Multiplication:** $(a+bi)(c+di) = (ac-bd) + (ad+bc)i$. (Expand using the distributive law and $i^2 = -1$.)

The **conjugate** of $z = a+bi$ is $\bar{z} = a-bi$; geometrically, conjugation is reflection in the real axis.
Recall the properties of conjugation, such as $\overline{w+z} = \bar{w} + \bar{z}$ and $\overline{wz} = \bar{w}\,\bar{z}$.
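Python's built-in `complex` type (where $i$ is written `1j`) implements exactly these operations, which makes it handy for checking hand computations:

```python
z = 3 + 4j
w = -1 + 2j

print(w + z)              # (2+6j), componentwise like vector addition
print(w * z)              # (-1+2j)(3+4j) = (-3-8) + (6-4)j = (-11+2j)
print(z.conjugate())      # (3-4j)
print(z * z.conjugate())  # a^2 + b^2 = 25, as the complex number (25+0j)
```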

The **absolute value** or **modulus** $|z|$ of $z = a+bi$ is
$$
\kern-4ex
|z| = |a+bi| = \sqrt{a^2+b^2}, \qtext{the distance from the origin.}
$$
Note that
$$
\kern-7ex
z \bar{z} = (a+bi)(a-bi) = a^2 -abi+abi-b^2 i^2 = a^2 + b^2 = |z|^2
$$
This means that for $z \neq 0$
$$
\kern-4ex
\frac{z \bar{z}}{|z|^2} = 1 \qtext{so} z^{-1} = \frac{\bar{z}}{|z|^2}
$$
This can be used to compute quotients of complex numbers:
$$
\kern-4ex
\frac{w}{z} = \frac{w}{z} \frac{\bar{z}}{\bar{z}} = \frac{w \bar{z}}{|z|^2}.
$$
**Example:**
$$
\kern-8ex
\frac{-1+2i}{3+4i} = \frac{-1+2i}{3+4i} \frac{3-4i}{3-4i} = \frac{5+10i}{3^2+4^2}
= \frac{5+10i}{25} = \frac{1}{5} + \frac{2}{5}i
$$
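This example is easy to verify with Python's built-in complex division (the names `q` and `q2` are just for illustration):

```python
w = -1 + 2j
z = 3 + 4j

# Built-in complex division:
q = w / z
print(q)   # 1/5 + (2/5)i, i.e. (0.2+0.4j)

# The same value via the conjugate trick from above:
q2 = (w * z.conjugate()) / abs(z) ** 2
print(abs(q - q2) < 1e-12)   # True
```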

We also learned the properties of the absolute value; one of them is $|wz| = |w|\,|z|$.

A complex number $z = a + bi$ can also be expressed in **polar coordinates**
$(r, \theta)$, where $r = |z| \geq 0$ and $\theta$ is such that
$$
\kern-6ex
a = r \cos \theta \qqtext{and} b = r \sin \theta
$$
Then
$$
\kern-6ex
z = r \cos \theta + (r \sin \theta) i = r(\cos \theta + i \sin \theta)
$$

Let
$$
\kern-7ex
z_1 = r_1(\cos \theta_1 + i \sin\theta_1) \qtext{and} z_2 = r_2(\cos \theta_2 + i \sin\theta_2) .
$$
Then
$$
\kern-9ex
\begin{aligned}
z_1 z_2 &= r_1 r_2 (\cos \theta_1 + i \sin\theta_1) (\cos \theta_2 + i \sin\theta_2) \\
&= r_1 r_2 [(\cos \theta_1 \cos\theta_2 - \sin\theta_1\sin\theta_2) + i(\sin\theta_1 \cos\theta_2 + \cos\theta_1\sin\theta_2)] \\
&= r_1 r_2 [\cos(\theta_1 + \theta_2) +i \sin(\theta_1+\theta_2)]
\end{aligned}
$$
So
$$
\kern-8ex
|z_1 z_2| = |z_1| |z_2| \qtext{and} \Arg(z_1 z_2) = \Arg z_1 + \Arg z_2
$$
(up to multiples of $2\pi$).

In particular, if $z = r (\cos \theta + i \sin \theta)$, then
$z^2 = r^2 (\cos (2 \theta) + i \sin (2 \theta))$.
It follows that the two **square roots** of $z$ are
$$
\pm \sqrt{r} \, (\cos (\theta/2) + i \sin (\theta/2))
$$
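The polar form and the half-angle formula for square roots can be checked with the standard-library `cmath` module; a small sketch using $z = 1 + i$:

```python
import cmath

z = 1 + 1j
r, theta = cmath.polar(z)          # r = sqrt(2), theta = pi/4
print(r, theta)

# Reconstruct z = r(cos(theta) + i sin(theta)).
print(cmath.rect(r, theta))        # very close to (1+1j)

# One square root via the half-angle formula; the other is its negative.
w = cmath.sqrt(r) * (cmath.cos(theta / 2) + 1j * cmath.sin(theta / 2))
print(abs(w * w - z) < 1e-12)          # True
print(abs(w - cmath.sqrt(z)) < 1e-12)  # matches the principal square root
```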

**Example 4.7:** Find the eigenvalues of $A = \bmat{rr} 0 & -1 \\ 1 & 0 \emat$
(a) over $\R$ and (b) over $\C$.

**Solution:** We must solve
$$
0 = \det(A-\lambda I) = \det \bmat{cc} -\lambda & -1 \\ 1 & -\lambda \emat = \lambda^2 + 1 .
$$
(a) Over $\R$, there are no solutions, so $A$ has no real eigenvalues.
This is why the Theorem above says "at most $n$".
(This matrix represents rotation by 90 degrees, and we also saw
geometrically that it has no real eigenvectors.)

(b) Over $\C$, the solutions are $\lambda = i$ and $\lambda = -i$.
For example, the eigenvectors for $\lambda = i$ are the nonzero **complex** multiples of $\coll i 1$,
since
$$
\bmat{rr} 0 & -1 \\ 1 & 0 \emat \coll i 1 = \coll {-1} i = i \coll i 1 .
$$
In fact, $\lambda^2 + 1 = (\lambda - i)(\lambda + i)$, so each of these
eigenvalues has algebraic multiplicity 1.
So in this case the sum of the algebraic multiplicities is **exactly** 2.
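As a numerical cross-check (assuming NumPy is available; `np.linalg.eig` works over the complex numbers):

```python
import numpy as np

# The 90-degree rotation matrix from Example 4.7.
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # i and -i, in some order

# Check A v = i v for the eigenvector v = (i, 1) from the solution above.
v = np.array([1j, 1.0])
print(np.allclose(A @ v, 1j * v))   # True
```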

The Fundamental Theorem of Algebra can be extended to say:

**Theorem D.4 (The Fundamental Theorem of Algebra):**
A polynomial of degree $n$ has at most $n$ distinct **complex** roots.
In fact, the sum of their multiplicities is **exactly** $n$.

Another way to put it is that over the complex numbers, every polynomial
factors into **linear** factors.

If the matrix $A$ has only real entries, then the characteristic polynomial has real coefficients. Say it is
$$
\kern-6ex
\det(A - \lambda I) = a_n \lambda^n + a_{n-1} \lambda^{n-1} + \cdots + a_1 \lambda + a_0 ,
$$
with all of the $a_i$'s real numbers. If $z$ is an eigenvalue, then so is its complex conjugate $\bar{z}$, because
$$
\kern-8ex
\begin{aligned}
&a_n \bar{z}^n + a_{n-1} \bar{z}^{n-1} + \cdots + a_1 \bar{z} + a_0 \\[5pt]
&\quad= \overline{a_n {z}^n + a_{n-1} {z}^{n-1} + \cdots + a_1 {z} + a_0} = \bar{0} = 0.
\end{aligned}
$$

**Theorem:** The complex eigenvalues of a **real** matrix come in conjugate pairs.

**Examples:** $\bmat{rr} 1 & 2 \\ 0 & i \emat$ and $\bmat{rr} 1 & i \\ 0 & 2 \emat$. Both are triangular, so their eigenvalues are the diagonal entries: the first has eigenvalue $i$ but not $-i$, showing that the conclusion can fail for a non-real matrix.

Also, when searching for roots of a characteristic polynomial, don't forget to try small integers first.

**Example:** Find the real and complex eigenvalues of
$A = \bmat{rrr} 2 & 3 & 0 \\ 1 & 2 & 2 \\ 0 & -2 & 1 \emat$.

**Solution:**
$$
\kern-8ex
\begin{aligned}
\bdmat{ccc} 2-\lambda & 3 & 0 \\ 1 & 2-\lambda & 2 \\ 0 & -2 & 1-\lambda \edmat
&= (2 - \lambda) \bdmat{cc} 2 - \lambda & 2 \\ -2 & 1-\lambda \edmat - 3 \bdmat{cc} 1 & 2 \\ 0 & 1-\lambda \edmat \\
&= (2 - \lambda) ( \lambda^2 - 3 \lambda + 6 ) - 3 (1-\lambda) \\
&= - \lambda^3 + 5 \lambda^2 - 9 \lambda + 9 .
\end{aligned}
$$
By trial and error, $\lambda = 3$ is a root. So we factor:
$$
- \lambda^3 + 5 \lambda^2 - 9 \lambda + 9
= (\lambda - 3)(-\lambda^2 + 2 \lambda - 3)
$$
We don't find any obvious roots for the quadratic factor, so we use the quadratic formula:
$$
\kern-6ex
\begin{aligned}
\lambda &= \frac{-2 \pm \sqrt{2^2 - 4(-1)(-3)}}{-2} = \frac{-2 \pm \sqrt{-8}}{-2} \\
&= \frac{-2 \pm 2 \sqrt{2} \, i}{-2} = 1 \pm \sqrt{2} \, i .
\end{aligned}
$$
So the eigenvalues are $3$, $1 + \sqrt{2} \, i$ and $1 - \sqrt{2} \, i$.
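A quick numerical check of this example (assuming NumPy is available):

```python
import numpy as np

A = np.array([[2.0, 3.0, 0.0],
              [1.0, 2.0, 2.0],
              [0.0, -2.0, 1.0]])

eigvals = np.linalg.eig(A)[0]
# Expected: 3, 1 + sqrt(2) i, 1 - sqrt(2) i.  Sort both lists by
# imaginary part so the comparison doesn't depend on numpy's ordering.
computed = sorted(eigvals, key=lambda t: t.imag)
expected = sorted([3, 1 + np.sqrt(2) * 1j, 1 - np.sqrt(2) * 1j],
                  key=lambda t: t.imag)
print(np.allclose(computed, expected))   # True
```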

**Note:** Our questions always involve real eigenvalues and
real eigenvectors unless we say otherwise. But there **will**
be problems where we ask for complex eigenvalues.

**Theorem 4.18:**
If $\vx$ is an eigenvector of $A$ with eigenvalue $\lambda$,
then $\vx$ is an eigenvector of $A^k$ with eigenvalue $\lambda^k$.
This holds for each integer $k \geq 0$, and also for $k < 0$ if
$A$ is invertible.
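A small numerical illustration of Theorem 4.18, using the matrix from Example 4.24 below (assuming NumPy is available):

```python
import numpy as np

# A has eigenvector v = (1, 1) with eigenvalue 4.
A = np.array([[1.0, 3.0], [2.0, 2.0]])
v = np.array([1.0, 1.0])
print(np.allclose(A @ v, 4 * v))                              # True

# So v is an eigenvector of A^3 with eigenvalue 4^3 = 64 ...
print(np.allclose(np.linalg.matrix_power(A, 3) @ v, 64 * v))  # True

# ... and, since A is invertible (det A = -4), also of A^{-1},
# with eigenvalue 4^{-1}.
print(np.allclose(np.linalg.matrix_power(A, -1) @ v, v / 4))  # True
```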

We saw that this was useful computationally. We also saw:

**Theorem 4.20:** If $\vv_1, \vv_2, \ldots, \vv_m$ are eigenvectors of $A$
corresponding to distinct eigenvalues
$\lambda_1, \lambda_2, \ldots, \lambda_m$, then
$\vv_1, \vv_2, \ldots, \vv_m$ are linearly independent.

We saw that sometimes the eigenvectors span $\R^n$, and sometimes they don't.

**Definition:** Let $A$ and $B$ be $n \times n$ matrices. We say that
$A$ is **similar** to $B$ if there is an invertible matrix $P$ such that $P^{-1} A P = B$.
When this is the case, we write $A \sim B$.

It is equivalent to say that $AP = PB$ or $A = PBP^{-1}$.

**Example 4.22:** Let $A = \bmat{rr} 1 & 2 \\ 0 & -1 \emat$ and $B = \bmat{rr} 1 & 0 \\ -2 & -1 \emat$.
Then $A \sim B$, since
$$
\bmat{rr} 1 & 2 \\ 0 & -1 \emat \bmat{rr} 1 & -1 \\ 1 & 1 \emat
= \bmat{rr} 1 & -1 \\ 1 & 1 \emat \bmat{rr} 1 & 0 \\ -2 & -1 \emat.
$$
We also need to check that the matrix $P = \bmat{rr} 1 & -1 \\ 1 & 1 \emat$ is
invertible, which is the case since its determinant is $2$.
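These checks are easy to automate (a sketch assuming NumPy is available):

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, -1.0]])
B = np.array([[1.0, 0.0], [-2.0, -1.0]])
P = np.array([[1.0, -1.0], [1.0, 1.0]])

print(np.linalg.det(P))                          # 2, so P is invertible
print(np.allclose(A @ P, P @ B))                 # True: AP = PB
print(np.allclose(np.linalg.inv(P) @ A @ P, B))  # True: P^{-1}AP = B
```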

It is tricky in general to find such a $P$ when it exists. We'll learn a method that works in a certain situation in this section.

**Theorem 4.21:** Let $A$, $B$ and $C$ be $n \times n$ matrices. Then:

a. $A \sim A$.

b. If $A \sim B$ then $B \sim A$.

c. If $A \sim B$ and $B \sim C$, then $A \sim C$.

**Proof:** (a) $I^{-1} A I = A$, so $A \sim A$.

(b) Suppose $A \sim B$. Then $P^{-1}AP = B$ for some invertible matrix $P$. Then $PBP^{-1} = A$. Let $Q = P^{-1}$. Then $Q^{-1}BQ = A$, so $B \sim A$.

(c) Exercise.$\quad\Box$

Similar matrices have a lot of properties in common.

**Theorem 4.22:**
Let $A$ and $B$ be similar matrices. Then:

a. $\det A = \det B$

b. $A$ is invertible iff $B$ is invertible.

c. $A$ and $B$ have the same rank.

d. $A$ and $B$ have the same characteristic polynomial.

e. $A$ and $B$ have the same eigenvalues.

**Proof:**
Assume that $P^{-1}AP = B$ for some invertible matrix $P$.

We discussed (a) last time:
$$
\begin{aligned}
\det(B) &= \det(P^{-1}AP) = \det(P^{-1})\det(A)\det(P)\\
&= \frac{1}{\det (P)} \det(A) \det(P) = \det A .
\end{aligned}
$$
(b) follows immediately, since a matrix is invertible if and only if its determinant is nonzero.

(c) takes a bit of work and will not be covered.

(d) follows from (a): since $B - \lambda I = P^{-1} A P - \lambda I = P^{-1} (A - \lambda I) P$ it follows that $B - \lambda I$ and $A - \lambda I$ have the same determinant.

(e) follows from (d).$\quad\Box$

**Question:** Are $\bmat{rr} 1 & 2 \\ 3 & 4 \emat$ and $\bmat{rr} 1 & 1 \\ 2 & -1 \emat$
similar?

**Question:** Are $\bmat{rr} 1 & 1 \\ 0 & 1 \emat$ and $\bmat{rr} 1 & 0 \\ 0 & 1 \emat$
similar?

See also Example 4.23(b) in text.
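One way to approach such questions is to compare the invariants from Theorem 4.22 (a numerical sketch, assuming NumPy is available). For the first pair the determinants already differ. For the second pair all of the invariants in Theorem 4.22 agree, yet the matrices are still not similar, since $P^{-1} I P = I$ for every invertible $P$; so matching invariants do not guarantee similarity.

```python
import numpy as np

# First question: similar matrices have equal determinants (Theorem 4.22),
# so differing determinants rule out similarity.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[1.0, 1.0], [2.0, -1.0]])
print(np.linalg.det(A))   # -2
print(np.linalg.det(B))   # -3: different, so A and B are not similar
```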

**Definition:** $A$ is **diagonalizable** if it is similar to some diagonal matrix.

**Example 4.24:** $A = \bmat{rr} 1 & 3 \\ 2 & 2 \emat$ is diagonalizable.
Take $P = \bmat{rr} 1 & 3 \\ 1 & -2 \emat$. Then
$$
P^{-1} A P = \cdots = \bmat{rr} 4 & 0 \\ 0 & -1 \emat
$$
If $A$ is similar to a diagonal matrix $D$, then $D$ must have the eigenvalues
of $A$ on the diagonal. But how to find $P$?

On board: notice that the columns of $P$ are eigenvectors for $A$!

More precisely, there exist an invertible matrix $P$ and a diagonal matrix $D$ with $P^{-1}AP = D$ if and only if the columns of $P$ are $n$ linearly independent eigenvectors of $A$ and the diagonal entries of $D$ are the corresponding eigenvalues in the same order.
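This recipe can be carried out numerically for Example 4.24 (a sketch assuming NumPy is available):

```python
import numpy as np

# Example 4.24: the columns of P are eigenvectors of A, and the diagonal
# entries of D are the corresponding eigenvalues, in the same order.
A = np.array([[1.0, 3.0], [2.0, 2.0]])
P = np.array([[1.0, 3.0], [1.0, -2.0]])   # eigenvectors for 4 and -1
D = np.diag([4.0, -1.0])
print(np.allclose(np.linalg.inv(P) @ A @ P, D))   # True

# np.linalg.eig packages the same data: the eigenvalues, plus a matrix
# whose columns are corresponding eigenvectors.
eigvals, V = np.linalg.eig(A)
print(np.allclose(np.linalg.inv(V) @ A @ V, np.diag(eigvals)))   # True
```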

This theorem is one of the main reasons we want to be able to find eigenvectors of a matrix. Moreover, the more eigenvectors the better, so this motivates allowing complex eigenvectors. We're going to say a lot more about diagonalization.