Today we finish 4.3 and start 4.4. Continue **reading** Section 4.4 for next class.
Work through recommended homework questions.

**Final exam:** Monday, December 8, 9am to noon.
See the course home page
for final exam **conflict** policy.
You should **immediately** notify the registrar or your Dean's office
(and your instructor) of any conflicts! (Deadline Nov 21.)

**Help Centers:** Monday-Friday 2:30-6:30 in MC 106.

**Question:** Can you find a nonzero complex number $z$ such that $z^2 = 0$?

**True/False:** If $z$ and $w$ are complex numbers in the first quadrant,
then so is $zw$.

A **complex number** is a number of the form $a + bi$, where $a$
and $b$ are real numbers and $i$ is a symbol such that $i^2 = -1$.

**Addition:** $(a+bi)+(c+di) = (a+c) + (b+d)i$, like vector addition.

**Multiplication:** $(a+bi)(c+di) = (ac-bd) + (ad+bc)i$.

The **conjugate** of $z = a+bi$ is $\bar{z} = a-bi$. Geometrically, this is reflection in the real axis.
We learned the properties of conjugation, such as $\overline{z+w} = \bar{z} + \bar{w}$ and $\overline{zw} = \bar{z} \, \bar{w}$.
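These rules agree with Python's built-in `complex` type, which gives a quick way to sanity-check them (a sketch with arbitrarily chosen sample values):

```python
# Sample values chosen arbitrarily.
a, b, c, d = 1.0, 2.0, 3.0, -4.0
z = complex(a, b)    # a + bi
w = complex(c, d)    # c + di

assert z + w == complex(a + c, b + d)             # addition rule
assert z * w == complex(a*c - b*d, a*d + b*c)     # multiplication rule
assert z.conjugate() == complex(a, -b)            # conjugate: a - bi
```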

The **absolute value** or **modulus** of $z = a+bi$ is
$$
\kern-4ex
|z| = |a+bi| = \sqrt{a^2+b^2}, \qtext{the distance from the origin.}
$$
Since $z \bar{z} = |z|^2$, we have that
$$
\kern-4ex
\frac{z \bar{z}}{|z|^2} = 1 \qtext{so} \frac{1}{z} = \frac{\bar{z}}{|z|^2}
\qtext{(for $z \neq 0$)}
$$
This can be used to divide complex numbers:
$$
\kern-4ex
\frac{w}{z} = \frac{w}{z} \frac{\bar{z}}{\bar{z}} = \frac{w \bar{z}}{|z|^2}.
$$

We learned the properties of absolute value. One of them was $|w z| = |w| |z|$.
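The division trick and the product rule for absolute values are easy to verify numerically; here is a sketch using Python's `cmath` module with arbitrarily chosen sample values:

```python
import cmath

z = 3 - 4j           # |z| = 5
w = 1 + 2j

assert z * z.conjugate() == abs(z)**2                       # z zbar = |z|^2
assert cmath.isclose(1 / z, z.conjugate() / abs(z)**2)      # 1/z = zbar / |z|^2
assert cmath.isclose(w / z, w * z.conjugate() / abs(z)**2)  # division via the conjugate
assert cmath.isclose(abs(w * z), abs(w) * abs(z))           # |wz| = |w| |z|
```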

A complex number $z = a + bi$ can also be expressed in **polar coordinates**
$(r, \theta)$, where $r = |z| \geq 0$ and $\theta$ is such that
$$
\kern-6ex
a = r \cos \theta \qqtext{and} b = r \sin \theta
$$
Then
$$
\kern-6ex
z = r \cos \theta + (r \sin \theta) i = r(\cos \theta + i \sin \theta)
$$

Let
$$
\kern-7ex
z_1 = r_1(\cos \theta_1 + i \sin\theta_1) \qtext{and} z_2 = r_2(\cos \theta_2 + i \sin\theta_2) .
$$
Then
$$
\kern-9ex
\begin{aligned}
z_1 z_2\ &= r_1 r_2 (\cos \theta_1 + i \sin\theta_1) (\cos \theta_2 + i \sin\theta_2) \\
&= r_1 r_2 [(\cos \theta_1 \cos\theta_2 - \sin\theta_1\sin\theta_2) + i(\sin\theta_1 \cos\theta_2 + \cos\theta_1\sin\theta_2)] \\
&= r_1 r_2 [\cos(\theta_1 + \theta_2) +i \sin(\theta_1+\theta_2)]
\end{aligned}
$$
So
$$
\kern-7ex
|z_1 z_2| = |z_1| |z_2| \qtext{and} \Arg(z_1 z_2) = \Arg z_1 + \Arg z_2
$$
(up to multiples of $2\pi$).
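The "moduli multiply, arguments add" rule can be checked numerically; the sketch below uses `cmath.rect` to build complex numbers from polar coordinates (sample values chosen arbitrarily):

```python
import cmath

# Build z1, z2 from polar coordinates (r, theta).
z1 = cmath.rect(2.0, 0.3)    # r1 = 2,   theta1 = 0.3
z2 = cmath.rect(1.5, 1.1)    # r2 = 1.5, theta2 = 1.1

prod = z1 * z2
assert cmath.isclose(abs(prod), 2.0 * 1.5)          # moduli multiply
assert cmath.isclose(cmath.phase(prod), 0.3 + 1.1)  # arguments add (no 2*pi wraparound here)
```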

In particular, if $z = r (\cos \theta + i \sin \theta)$, then
$z^2 = r^2 (\cos (2 \theta) + i \sin (2 \theta))$.
It follows that the two **square roots** of $z$ are
$$
\pm \sqrt{r} \, (\cos (\theta/2) + i \sin (\theta/2))
$$
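As a sanity check of the square-root formula, the sketch below (with an arbitrarily chosen $z$) confirms that both values square back to $z$:

```python
import cmath, math

z = -3 + 4j
r, theta = cmath.polar(z)    # r = |z| = 5, theta = Arg z

# The two square roots: +/- sqrt(r) (cos(theta/2) + i sin(theta/2))
root = math.sqrt(r) * complex(math.cos(theta / 2), math.sin(theta / 2))

assert cmath.isclose(root**2, z)
assert cmath.isclose((-root)**2, z)
```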

If $a$ is a root of a polynomial $f(x)$, then $f(x) = (x-a) g(x)$ for some polynomial $g(x)$.
Sometimes, $a$ is also a root of $g(x)$.
Then $f(x) = (x-a)^2 h(x)$. The largest $k$ such that $(x-a)^k$
is a factor of $f$ is called the **multiplicity** of the root $a$ in $f$.

In the case of an eigenvalue, we call its multiplicity in the characteristic
polynomial the **algebraic multiplicity** of this eigenvalue.

**Example:** Let $f(x) = x^2 - 2x + 1$. Since $f(1) = 1 - 2 + 1 = 0$,
$1$ is a root of $f$. And since $f(x) = (x-1)^2$, $1$ has multiplicity $2$.


**Theorem D.4 (The Fundamental Theorem of Algebra):**
A polynomial of degree $n$ has at most $n$ distinct roots.
In fact, the sum of the multiplicities is at most $n$.

Therefore:

**Theorem:**
An $n \times n$ matrix $A$ has at most $n$ distinct eigenvalues.
In fact, the sum of the algebraic multiplicities is at most $n$.

**Example 4.7:** Find the eigenvalues of $A = \bmat{rr} 0 & -1 \\ 1 & 0 \emat$
(a) over $\R$ and (b) over $\C$.

**Solution:** We must solve
$$
0 = \det(A-\lambda I) = \det \bmat{cc} -\lambda & -1 \\ 1 & -\lambda \emat = \lambda^2 + 1 .
$$
(a) Over $\R$, there are no solutions, so $A$ has no real eigenvalues.
This is why the Theorem above says "at most $n$".
(This matrix represents rotation by 90 degrees, and we also saw
geometrically that it has no real eigenvectors.)

(b) Over $\C$, the solutions are $\lambda = i$ and $\lambda = -i$.
For example, the eigenvectors for $\lambda = i$ are the nonzero **complex** multiples of $\coll i 1$,
since
$$
\bmat{rr} 0 & -1 \\ 1 & 0 \emat \coll i 1 = \coll {-1} i = i \coll i 1 .
$$
In fact, $\lambda^2 + 1 = (\lambda - i)(\lambda + i)$, so each of these
eigenvalues has algebraic multiplicity 1.
So in this case the sum of the algebraic multiplicities is **exactly** 2.
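The eigenvalue computation over $\C$ can be reproduced numerically; here is a minimal Python sketch checking that $A$ times $(i, 1)$ equals $i$ times $(i, 1)$:

```python
# Eigenvalues of A = [[0, -1], [1, 0]]: lambda^2 + 1 = 0 gives lambda = +/- i.
lam1 = 1j
lam2 = -1j
assert lam1**2 + 1 == 0 and lam2**2 + 1 == 0

# Check A v = i v for the eigenvector v = (i, 1), with the 2x2 product
# written out entrywise.
v = (1j, 1)
Av = (0 * v[0] + (-1) * v[1],   # first entry:  -1
      1 * v[0] + 0 * v[1])      # second entry:  i
assert Av == (lam1 * v[0], lam1 * v[1])
```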

The Fundamental Theorem of Algebra can be extended to say:

**Theorem D.4 (The Fundamental Theorem of Algebra):**
A polynomial of degree $n$ has at most $n$ distinct **complex** roots.
In fact, the sum of their multiplicities is **exactly** $n$.

Another way to put it is that over the complex numbers, every polynomial
factors into **linear** factors.

If the matrix $A$ has only real entries, then the characteristic polynomial has real coefficients. Say it is
$$
\kern-6ex
\det(A - \lambda I) = a_n \lambda^n + a_{n-1} \lambda^{n-1} + \cdots + a_1 \lambda + a_0 ,
$$
with all of the $a_i$'s real numbers. If $z$ is an eigenvalue, then so is its complex conjugate $\bar{z}$, because
$$
\kern-8ex
\begin{aligned}
&a_n \bar{z}^n + a_{n-1} \bar{z}^{n-1} + \cdots + a_1 \bar{z} + a_0 \\[5pt]
&\quad\quad= \overline{a_n {z}^n + a_{n-1} {z}^{n-1} + \cdots + a_1 {z} + a_0} = \bar{0} = 0.
\end{aligned}
$$

**Theorem:** The complex eigenvalues of a **real** matrix come in conjugate pairs.
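The conjugate-pair phenomenon is easy to see numerically. The sketch below uses a hypothetical example polynomial, $p(x) = x^2 - 2x + 2$, which has real coefficients and roots $1 \pm i$:

```python
# p(x) = x^2 - 2x + 2 has REAL coefficients, and its roots 1 + i and 1 - i
# form a conjugate pair.
def p(x):
    return x**2 - 2*x + 2

z = 1 + 1j
assert p(z) == 0                 # z is a root...
assert p(z.conjugate()) == 0     # ...and so is its conjugate
```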

**Examples** (showing that the hypothesis that $A$ is real is needed): $\bmat{rr} 1 & 2 \\ 0 & i \emat$ has eigenvalues $1$ and $i$, which are not conjugates; $\bmat{rr} 1 & i \\ 0 & 2 \emat$ has real eigenvalues $1$ and $2$.

When searching for roots of a characteristic polynomial, don't forget to try small integers first.

**Example:** Find the real and complex eigenvalues of
$A = \bmat{rrr} 2 & 3 & 0 \\ 1 & 2 & 2 \\ 0 & -2 & 1 \emat$.

**Solution:**
$$
\kern-8ex
\begin{aligned}
\bdmat{ccc} 2-\lambda & 3 & 0 \\ 1 & 2-\lambda & 2 \\ 0 & -2 & 1-\lambda \edmat
\ &= (2 - \lambda) \bdmat{cc} 2 - \lambda & 2 \\ -2 & 1-\lambda \edmat - 3 \bdmat{cc} 1 & 2 \\ 0 & 1-\lambda \edmat \\
&= (2 - \lambda) ( \lambda^2 - 3 \lambda + 6 ) - 3 (1-\lambda) \\
&= - \lambda^3 + 5 \lambda^2 - 9 \lambda + 9 .
\end{aligned}
$$
By trial and error, $\lambda = 3$ is a root. So we factor:
$$
- \lambda^3 + 5 \lambda^2 - 9 \lambda + 9
= (\lambda - 3)(- \lambda^2 + 2 \lambda - 3)
$$
We don't find any obvious roots for the quadratic factor, so we use the quadratic formula:
$$
\kern-6ex
\begin{aligned}
\lambda\ &= \frac{-2 \pm \sqrt{2^2 - 4(-1)(-3)}}{-2} = \frac{-2 \pm \sqrt{-8}}{-2} \\
&= \frac{-2 \pm 2 \sqrt{2} \, i}{-2} = 1 \pm \sqrt{2} \, i .
\end{aligned}
$$
So the eigenvalues are $3$, $1 + \sqrt{2} \, i$ and $1 - \sqrt{2} \, i$.
The algebraic multiplicities are all $1$, and $1+1+1=3$.
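The roots found above can be verified by substituting them back into the characteristic polynomial; a quick sketch:

```python
import cmath

def charpoly(lam):
    # det(A - lambda I) computed above: -lambda^3 + 5 lambda^2 - 9 lambda + 9
    return -lam**3 + 5 * lam**2 - 9 * lam + 9

assert charpoly(3) == 0                       # the real root, exactly
for lam in (1 + cmath.sqrt(2) * 1j, 1 - cmath.sqrt(2) * 1j):
    assert abs(charpoly(lam)) < 1e-12         # the conjugate pair (up to rounding)
```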

**Note:** Our questions always involve real eigenvalues and
real eigenvectors unless we say otherwise. But there **will**
be problems where we ask for complex eigenvalues.

**Definition:** Let $A$ and $B$ be $n \times n$ matrices. We say that
$A$ is **similar** to $B$ if there is an invertible matrix $P$ such that $P^{-1} A P = B$.
When this is the case, we write $A \sim B$.

Since $P$ is invertible, it is equivalent to say that $AP = PB$ or $A = PBP^{-1}$.

**Example 4.22:** Let $A = \bmat{rr} 1 & 2 \\ 0 & -1 \emat$ and $B = \bmat{rr} 1 & 0 \\ -2 & -1 \emat$.
Then $A \sim B$, since
$$
\bmat{rr} 1 & 2 \\ 0 & -1 \emat \bmat{rr} 1 & -1 \\ 1 & 1 \emat
= \bmat{rr} 1 & -1 \\ 1 & 1 \emat \bmat{rr} 1 & 0 \\ -2 & -1 \emat.
$$
We also need to check that the matrix $P = \bmat{rr} 1 & -1 \\ 1 & 1 \emat$ is
invertible, which is the case since its determinant is $2$.
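The claim $AP = PB$ can be double-checked with a small hand-rolled matrix product (a sketch for this example, not a general implementation):

```python
# 2x2 matrices as lists of rows.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [0, -1]]
B = [[1, 0], [-2, -1]]
P = [[1, -1], [1, 1]]

assert matmul(A, P) == matmul(P, B)                  # both sides equal [[3, 1], [-1, -1]]
assert P[0][0] * P[1][1] - P[0][1] * P[1][0] == 2    # det P = 2, so P is invertible
```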

It is tricky in general to find such a $P$ when it exists. We'll learn a method that works in a certain situation in this section.