Math 1600A Lecture 29, Section 2, 18 Nov 2013

$ \newcommand{\bdmat}[1]{\left|\begin{array}{#1}} \newcommand{\edmat}{\end{array}\right|} \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\ccollll}[4]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\ccolllll}[5]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \renewcommand{\Re}{\operatorname{Re}} \renewcommand{\Im}{\operatorname{Im}} \renewcommand{\Arg}{\operatorname{Arg}} \renewcommand{\arg}{\operatorname{arg}} \newcommand{\adj}{\textrm{adj}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\query}[1]{\toggle{\text{?}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\text{?}}{#1}\endtoggle} $

Announcements:

Today we start 4.4. Continue reading Section 4.4 for Wednesday. Work through recommended homework questions.

Tutorials: Quiz 5 is this week. It covers Appendix C and Sections 4.1, 4.2 and 4.3.
Office hour: Monday, 1:30-2:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.
Course evaluations will be at the start of Wednesday's class.

The final exam will take place on Mon Dec 9, 2-5pm.
Section 001: HSB 236 (last names A-W), HSB 240 (last names X-Z)
Section 002: HSB 240 (last names A-LI), HSB 35 (last names LIU-Z)
The final exam will cover all the material from the course, but will emphasize the later material. See the course home page for final exam conflict policy. You should already have notified the registrar or your Dean (and me) of any conflicts!

Partial review of Section 4.3

The eigenvalues of a square matrix $A$ can be computed as the roots (also called zeros) of the characteristic polynomial $$ \det ( A - \lambda I) $$
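As an aside (not from the text), this can be checked numerically: NumPy's np.poly computes the coefficients of the characteristic polynomial of a matrix (it uses the convention $\det(\lambda I - A)$, which has the same roots as $\det(A - \lambda I)$), and np.roots finds its roots. The matrix below is an arbitrary example.

```python
import numpy as np

# An arbitrary example matrix (not from the lecture).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly gives the coefficients of det(lambda*I - A), which has
# the same roots as det(A - lambda*I).
coeffs = np.poly(A)            # [1., -4., 3.], i.e. lambda^2 - 4*lambda + 3
print(np.roots(coeffs))        # [3., 1.]
print(np.linalg.eigvals(A))    # the same values (possibly in another order)
```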

Theorem D.2 (The Factor Theorem): Let $f$ be a polynomial and let $a$ be a constant. Then $a$ is a root of $f(x)$ (i.e. $f(a) = 0$) if and only if $x - a$ is a factor of $f(x)$ (i.e. $f(x) = (x - a) g(x)$ for some polynomial $g$).

The largest $k$ such that $(x-a)^k$ is a factor of $f$ is called the multiplicity of the root $a$ in $f$.

Example: Let $f(x) = x^2 - 2x + 1$. Since $f(1) = 1 - 2 + 1 = 0$, $1$ is a root of $f$. And since $f(x) = (x-1)^2$, $1$ has multiplicity $2$.
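As an optional numerical illustration of the Factor Theorem (my example, not from the text): dividing $f$ by $x - a$ leaves zero remainder exactly when $a$ is a root, and repeating the division reveals the multiplicity.

```python
import numpy as np

f = [1, -2, 1]                  # coefficients of x^2 - 2x + 1

# Divide by x - 1; a zero remainder means 1 is a root (Factor Theorem).
q, r = np.polydiv(f, [1, -1])
print(q, r)                     # q = [1., -1.] (i.e. x - 1), r = [0.]

# Dividing the quotient by x - 1 again also leaves remainder 0,
# so (x - 1)^2 divides f: the root 1 has multiplicity 2.
q2, r2 = np.polydiv(q, [1, -1])
print(q2, r2)                   # q2 = [1.], r2 = [0.]
```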

In the case of an eigenvalue, its multiplicity as a root of the characteristic polynomial is called the algebraic multiplicity of that eigenvalue.

We also define the geometric multiplicity of an eigenvalue $\lambda$ to be the dimension of the corresponding eigenspace $E_\lambda$.
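The two multiplicities need not agree. Here is a small NumPy sketch (my example, not from the text) where the eigenvalue $1$ has algebraic multiplicity $2$ but geometric multiplicity $1$:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

# Algebraic multiplicity: how often lam occurs as a root of the
# characteristic polynomial.
print(np.roots(np.poly(A)))     # [1., 1.]  ->  algebraic multiplicity 2

# Geometric multiplicity: dim E_lam = n - rank(A - lam*I).
n = A.shape[0]
print(n - np.linalg.matrix_rank(A - lam * np.eye(n)))   # 1
```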

Theorem 4.15: The eigenvalues of a triangular matrix are the entries on its main diagonal (repeated according to their algebraic multiplicity).

Theorem D.4 (The Fundamental Theorem of Algebra): A polynomial of degree $n$ has at most $n$ distinct roots. In fact, the sum of the multiplicities is at most $n$.

Therefore:

Theorem: An $n \times n$ matrix $A$ has at most $n$ distinct eigenvalues. In fact, the sum of the algebraic multiplicities is at most $n$.

New material: complex eigenvalues and eigenvectors

This material isn't covered in detail in the text.

Example 4.7: Find the eigenvalues of $A = \bmat{rr} 0 & -1 \\ 1 & 0 \emat$ (a) over $\R$ and (b) over $\C$.

Solution: We must solve $$ 0 = \det(A-\lambda I) = \det \bmat{cc} -\lambda & -1 \\ 1 & -\lambda \emat = \lambda^2 + 1 . $$ (a) Over $\R$, there are no solutions, so $A$ has no real eigenvalues. This is why the Theorem above says "at most $n$".

(b) Over $\C$, the solutions are $\lambda = i$ and $\lambda = -i$. For example, the eigenvectors for $\lambda = i$ are the nonzero complex multiples of $\coll i 1$, since $$ \bmat{rr} 0 & -1 \\ 1 & 0 \emat \coll i 1 = \coll {-1} i = i \coll i 1 . $$ In fact, $\lambda^2 + 1 = (\lambda - i)(\lambda + i)$, so each of these eigenvalues has algebraic multiplicity 1. So in this case the sum of the algebraic multiplicities is exactly 2.
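NumPy works over the complex numbers, so we can verify this example directly (a numerical check, using the same matrix $A$):

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals, vecs = np.linalg.eig(A)
print(vals)                     # [0.+1.j  0.-1.j], i.e. i and -i

# Each column of vecs is an eigenvector; check that A v = lambda v.
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))    # True, True
```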

The Fundamental Theorem of Algebra can be extended to say:

Theorem D.4 (The Fundamental Theorem of Algebra): A polynomial of degree $n$ has at most $n$ distinct complex roots. In fact, the sum of their multiplicities is exactly $n$.

Another way to put it is that over the complex numbers, every polynomial factors into linear factors.

Real matrices

Notice that $i$ and $-i$ are complex conjugates of each other.

If the matrix $A$ has only real entries, then the characteristic polynomial has real coefficients. Say it is $$ \kern-6ex \det(A - \lambda I) = a_n \lambda^n + a_{n-1} \lambda^{n-1} + \cdots + a_1 \lambda + a_0 , $$ with all of the $a_i$'s real numbers. If $\alpha$ is an eigenvalue, then so is its complex conjugate $\bar{\alpha}$, because $$ \kern-8ex \begin{aligned} &a_n \bar{\alpha}^n + a_{n-1} \bar{\alpha}^{n-1} + \cdots + a_1 \bar{\alpha} + a_0 \\ &\quad\qquad\qquad= \overline{a_n {\alpha}^n + a_{n-1} {\alpha}^{n-1} + \cdots + a_1 {\alpha} + a_0} = \bar{0} = 0. \end{aligned} $$

Theorem: The complex eigenvalues of a real matrix come in conjugate pairs.
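A quick numerical illustration of the theorem (my example: a random real matrix): the multiset of eigenvalues is unchanged by conjugation.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))     # a real 5x5 matrix

vals = np.linalg.eigvals(A)
# Sorting both lists the same way compares them as multisets:
# conjugation permutes the eigenvalues but doesn't change the multiset.
print(np.allclose(np.sort_complex(vals), np.sort_complex(vals.conj())))  # True
```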

Complex matrices

A complex matrix might have real or complex eigenvalues, and the complex eigenvalues do not have to come in conjugate pairs.

Examples: $\bmat{rr} 1 & 2 \\ 0 & i \emat$ is triangular with eigenvalues $1$ and $i$, which are not conjugates of each other; $\bmat{rr} 1 & i \\ 0 & 2 \emat$ is triangular with eigenvalues $1$ and $2$, both real.

General case

In general, don't forget that the quadratic formula $$ x = {-b \pm \sqrt{b^2-4ac} \over 2a} $$ gives the roots of $a x^2 + b x + c$, and these can be real (if $b^2 - 4ac \geqslant 0$) or complex (if $b^2 - 4ac < 0$).
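For reference, here is the quadratic formula as a small Python helper (my sketch, not part of the notes); cmath.sqrt handles a negative discriminant by returning a complex number.

```python
import cmath

def quadratic_roots(a, b, c):
    """Return the two roots of a*x**2 + b*x + c."""
    disc = cmath.sqrt(b**2 - 4*a*c)   # complex square root, so disc < 0 is fine
    return (-b + disc) / (2*a), (-b - disc) / (2*a)

print(quadratic_roots(1, -2, 1))   # ((1+0j), (1+0j)): the double root of x^2 - 2x + 1
print(quadratic_roots(1, 0, 1))    # (1j, -1j): the roots of x^2 + 1
```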

When hunting for roots by trial and error, try small integers first.

Example: Find the real and complex eigenvalues of $A = \bmat{rrr} 2 & 3 & 0 \\ 1 & 2 & 2 \\ 0 & -2 & 1 \emat$.

Solution: $$ \kern-6ex \begin{aligned} \bdmat{ccc} 2-\lambda & 3 & 0 \\ 1 & 2-\lambda & 2 \\ 0 & -2 & 1-\lambda \edmat &= (2 - \lambda) \bdmat{cc} 2 - \lambda & 2 \\ -2 & 1-\lambda \edmat - 3 \bdmat{cc} 1 & 2 \\ 0 & 1-\lambda \edmat \\ &= (2 - \lambda) ( \lambda^2 - 3 \lambda + 6 ) - 3 (1-\lambda) \\ &= - \lambda^3 + 5 \lambda^2 - 9 \lambda + 9 . \end{aligned} $$ By trial and error, $\lambda = 3$ is a root. So we factor: $$ - \lambda^3 + 5 \lambda^2 - 9 \lambda + 9 = (\lambda - 3)(- \lambda^2 + 2 \lambda - 3) . $$ We don't find any obvious roots for the quadratic factor, so we use the quadratic formula: $$ \kern-6ex \begin{aligned} \lambda &= \frac{-2 \pm \sqrt{2^2 - 4(-1)(-3)}}{-2} = \frac{-2 \pm \sqrt{-8}}{-2} \\ &= \frac{-2 \pm 2 \sqrt{2} \, i}{-2} = 1 \pm \sqrt{2} \, i . \end{aligned} $$ So the eigenvalues are $3$, $1 + \sqrt{2} \, i$ and $1 - \sqrt{2} \, i$.
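We can double-check these eigenvalues numerically (a verification of the example above, not part of the solution):

```python
import numpy as np

A = np.array([[2.0,  3.0, 0.0],
              [1.0,  2.0, 2.0],
              [0.0, -2.0, 1.0]])

print(np.linalg.eigvals(A))
# Approximately [3, 1+1.41421356j, 1-1.41421356j],
# i.e. 3 and 1 +/- sqrt(2) i, as computed above.
```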

Note: Our questions always involve real eigenvalues and real eigenvectors unless we say otherwise. But there will be problems where we ask for complex eigenvalues.

More review: Eigenvalues of powers and inverses

Theorem 4.18: If $\vx$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $\vx$ is an eigenvector of $A^k$ with eigenvalue $\lambda^k$. This holds for each integer $k \geq 0$, and also for $k < 0$ if $A$ is invertible.
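A quick numerical check of Theorem 4.18 (my example matrix, whose vector $(1,1)$ is an eigenvector with eigenvalue $3$):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])     # eigenvector of A with eigenvalue 3

# k = 5: A^5 x should equal 3^5 x.
print(np.allclose(np.linalg.matrix_power(A, 5) @ x, 3**5 * x))   # True

# k = -1: A is invertible, so A^{-1} x should equal (1/3) x.
print(np.allclose(np.linalg.inv(A) @ x, x / 3))                  # True
```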

We saw that this was useful computationally. We also saw:

Theorem 4.20: If $\vv_1, \vv_2, \ldots, \vv_m$ are eigenvectors of $A$ corresponding to distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_m$, then $\vv_1, \vv_2, \ldots, \vv_m$ are linearly independent.

We saw that sometimes the eigenvectors span $\R^n$, and sometimes they don't.

Section 4.4: Similarity and Diagonalization

We're going to introduce a new concept that will turn out to be closely related to eigenvalues and eigenvectors.

Definition: Let $A$ and $B$ be $n \times n$ matrices. We say that $A$ is similar to $B$ if there is an invertible matrix $P$ such that $P^{-1} A P = B$. When this is the case, we write $A \sim B$.

It is equivalent to say that $AP = PB$ or $A = PBP^{-1}$.

Example 4.22: Let $A = \bmat{rr} 1 & 2 \\ 0 & -1 \emat$ and $B = \bmat{rr} 1 & 0 \\ -2 & -1 \emat$. Then $A \sim B$, since $$ \bmat{rr} 1 & 2 \\ 0 & -1 \emat \bmat{rr} 1 & -1 \\ 1 & 1 \emat = \bmat{rr} 1 & -1 \\ 1 & 1 \emat \bmat{rr} 1 & 0 \\ -2 & -1 \emat. $$ We also need to check that the matrix $P = \bmat{rr} 1 & -1 \\ 1 & 1 \emat$ is invertible, which is the case since its determinant is $2$.
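To confirm Example 4.22 numerically (same matrices as above):

```python
import numpy as np

A = np.array([[1.0,  2.0], [ 0.0, -1.0]])
B = np.array([[1.0,  0.0], [-2.0, -1.0]])
P = np.array([[1.0, -1.0], [ 1.0,  1.0]])

print(np.linalg.det(P))                           # 2.0 (up to rounding), so P is invertible
print(np.allclose(A @ P, P @ B))                  # True:  AP = PB
print(np.allclose(np.linalg.inv(P) @ A @ P, B))   # True:  P^{-1} A P = B
```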

It is tricky in general to find such a $P$ when it exists. We'll learn a method that works in a certain situation in this section.

Theorem 4.21: Let $A$, $B$ and $C$ be $n \times n$ matrices. Then:
a. $A \sim A$.
b. If $A \sim B$ then $B \sim A$.
c. If $A \sim B$ and $B \sim C$, then $A \sim C$.

Proof: (a) $I^{-1} A I = A$, so $A \sim A$.

(b) Suppose $A \sim B$. Then $P^{-1}AP = B$ for some invertible matrix $P$. Then $PBP^{-1} = A$. Let $Q = P^{-1}$. Then $Q^{-1}BQ = A$, so $B \sim A$.

(c) Exercise.$\quad\Box$

Similar matrices have a lot of properties in common.

Theorem 4.22: Let $A$ and $B$ be similar matrices. Then:
a. $\det A = \det B$
b. $A$ is invertible iff $B$ is invertible.
c. $A$ and $B$ have the same rank.
d. $A$ and $B$ have the same characteristic polynomial.
e. $A$ and $B$ have the same eigenvalues.

Proof: Assume that $P^{-1}AP = B$ for some invertible matrix $P$.

We discussed (a) last time: $$ \begin{aligned} \det(B) &= \det(P^{-1}AP) = \det(P^{-1})\det(A)\det(P)\\ &= \frac{1}{\det (P)} \det(A) \det(P) = \det A . \end{aligned} $$ (b) follows immediately, since a matrix is invertible if and only if its determinant is nonzero.

(c) takes a bit of work and will not be covered.

(d) follows from (a): since $B - \lambda I = P^{-1} A P - \lambda I = P^{-1} (A - \lambda I) P$, the matrices $B - \lambda I$ and $A - \lambda I$ are similar, so they have the same determinant.

(e) follows from (d).$\quad\Box$
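These shared properties give a practical way to test for (non-)similarity. As an illustration (using the similar pair from Example 4.22), here is a NumPy check of properties (a), (c) and (d):

```python
import numpy as np

A = np.array([[1.0,  2.0], [ 0.0, -1.0]])
B = np.array([[1.0,  0.0], [-2.0, -1.0]])   # similar to A by Example 4.22

print(np.isclose(np.linalg.det(A), np.linalg.det(B)))        # True: same determinant
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B))  # True: same rank
print(np.allclose(np.poly(A), np.poly(B)))                   # True: same char. polynomial
```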

Question: Are $\bmat{rr} 1 & 2 \\ 3 & 4 \emat$ and $\bmat{rr} 1 & 1 \\ 2 & -1 \emat$ similar?

Question: Are $\bmat{rr} 1 & 1 \\ 0 & 1 \emat$ and $\bmat{rr} 1 & 0 \\ 0 & 1 \emat$ similar?

See also Example 4.23(b) in text.

Diagonalization

Definition: $A$ is diagonalizable if it is similar to some diagonal matrix.

Example 4.24: $A = \bmat{rr} 1 & 3 \\ 2 & 2 \emat$ is diagonalizable. Take $P = \bmat{rr} 1 & 3 \\ 1 & -2 \emat$. Then $$ P^{-1} A P = \cdots = \bmat{rr} 4 & 0 \\ 0 & -1 \emat . $$ If $A$ is similar to a diagonal matrix $D$, then $D$ must have the eigenvalues of $A$ on the diagonal. But how do we find $P$?
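As a numerical check of this example (same $A$ and $P$):

```python
import numpy as np

A = np.array([[1.0, 3.0], [2.0,  2.0]])
P = np.array([[1.0, 3.0], [1.0, -2.0]])

D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))    # [[ 4.  0.]
                          #  [ 0. -1.]]  (up to rounding)
```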

Theorem 4.23: Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors.

More precisely, there exist an invertible matrix $P$ and a diagonal matrix $D$ with $P^{-1}AP = D$ if and only if the columns of $P$ are $n$ linearly independent eigenvectors of $A$ and the diagonal entries of $D$ are the corresponding eigenvalues in the same order.

Proof: Suppose $\vp_1, \vp_2, \ldots, \vp_n$ are $n$ linearly independent eigenvectors of $A$, and let $P = [\,\vp_1\,\vp_2\,\cdots\,\vp_n\,]$. Write $\lambda_i$ for the $i$th eigenvalue, so $A \vp_i = \lambda_i \vp_i$ for each $i$, and let $D$ be the diagonal matrix with the $\lambda_i$'s on the diagonal. Then $$ \begin{aligned} AP &= A[\,\vp_1\,\vp_2\,\cdots\,\vp_n\,] = [\,\lambda_1\!\vp_1\ \ \lambda_2\!\vp_2\ \,\cdots\ \lambda_n\!\vp_n\,] \\ &= [\,\vp_1\,\vp_2\,\cdots\,\vp_n\,] \bmat{cccc} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \emat \end{aligned} $$ so $P^{-1} A P = D$, as required.

On the other hand, if $P^{-1} A P = D$ and $D$ is diagonal, then $AP = PD$, and it follows from an argument like the one above that the columns of $P$ are eigenvectors of $A$, and the eigenvalues are the diagonal entries of $D$.$\quad\Box$
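In fact, this is how np.linalg.eig packages its output: the columns of its second return value are eigenvectors, so when they are linearly independent they form a diagonalizing matrix $P$. A sketch using the matrix from Example 4.24:

```python
import numpy as np

A = np.array([[1.0, 3.0], [2.0, 2.0]])

vals, P = np.linalg.eig(A)    # columns of P are eigenvectors of A
D = np.diag(vals)             # eigenvalues in the corresponding order

print(np.allclose(np.linalg.inv(P) @ A @ P, D))   # True: P^{-1} A P = D
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True: A = P D P^{-1}
```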

This theorem is one of the main reasons we want to be able to find eigenvectors of a matrix. Moreover, the more eigenvectors the better, so this motivates allowing complex eigenvectors. We're going to say a lot more about diagonalization.