Today we (really) finish Section 4.4 and start Section 4.6, on linear recurrence relations. Section 4.6 is the last section we cover, and we cover only the part on linear recurrence relations, skipping the rest of Section 4.6 (except Thm 4.30, which we covered already).
On Wednesday, we finish 4.6, and I'll also do some review problems.
Friday will be entirely review. Bring questions!
Work through suggested exercises. There are additional practice problems on WeBWorK and on the suggested exercises page.
Homework 11 is on Gradescope and is due Friday.
Math Help Centre: M-F 12:30-5:30 in PAB48/49 and online 6pm-8pm.
My next office hour is today 3:30-4:20 in MC130.
We will be adding office hours and review sessions next week.
Definition: Let $A$ and $B$ be $n \times n$ matrices. We say that $A$ is similar to $B$ ($A \sim B$) if there is an invertible matrix $P$ such that $P^{-1} A P = B$.
Theorem 4.22: Let $A$ and $B$ be similar matrices. Then $A$ and $B$ have the same determinant, rank, characteristic polynomial and eigenvalues.
Definition: $A$ is diagonalizable if it is similar to some diagonal matrix.
Theorem 4.24: If $\lambda_1, \ldots, \lambda_k$ are distinct eigenvalues of $A$ and, for each $i$, $\cB_i$ is a basis for the eigenspace $E_{\lambda_i}$, then the union of the $\cB_i$'s is a linearly independent set.
Combining Theorems 4.23 and 4.24 gives the following important consequence:
Theorem: An $n \times n$ matrix is diagonalizable if and only if the sum of the geometric multiplicities of the eigenvalues is $n$.
In particular:
Theorem 4.25: If $A$ is an $n \times n$ matrix with $n$ distinct eigenvalues, then $A$ is diagonalizable.
So it is important to understand the geometric multiplicities better. Here is a helpful result:
Lemma 4.26: If $\lambda_1$ is an eigenvalue of an $n \times n$ matrix $A$, then $$ \kern-6ex \text{geometric multiplicity of } \lambda_1 \leqslant \text{algebraic multiplicity of } \lambda_1 $$
It follows that the only way for the geometric multiplicities to add to $n$ is if they are equal to the algebraic multiplicities and the algebraic multiplicities add to $n$:
Theorem 4.27 (The Diagonalization Theorem):
Let $A$ be an $n \times n$ matrix with distinct eigenvalues
$\lambda_1, \lambda_2, \ldots, \lambda_k$.
Let their geometric multiplicities be $g_1, g_2, \ldots, g_k$
and their algebraic multiplicities be $a_1, a_2, \ldots, a_k$.
Then the following are equivalent:
a. $A$ is diagonalizable.
b. $g_1 + \cdots + g_k = n$.
c. $g_i = a_i$ for each $i$ and $a_1 + \cdots + a_k = n$.
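Conditions (b) and (c) can be checked numerically, since the geometric multiplicity of $\lambda$ is $n$ minus the rank of $A - \lambda I$. A minimal sketch, assuming numpy is available (the first matrix is the one from Example 4.24 below; the second is chosen to illustrate failure):

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """dim E_lam = n - rank(A - lam * I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# A1 has eigenvalues 4 and -1, each with g_i = 1, and 1 + 1 = 2 = n,
# so A1 is diagonalizable.
A1 = np.array([[1.0, 3.0], [2.0, 2.0]])
assert geometric_multiplicity(A1, 4.0) + geometric_multiplicity(A1, -1.0) == 2

# A2 has eigenvalue 1 with algebraic multiplicity 2 but geometric
# multiplicity 1, so the g_i sum to 1 < 2: A2 is not diagonalizable.
A2 = np.array([[1.0, 1.0], [0.0, 1.0]])
assert geometric_multiplicity(A2, 1.0) == 1
```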
Steps for diagonalizing an $n \times n$ matrix $A$:
1. Compute the characteristic polynomial $\det(A - \lambda I)$ of $A$.
2. Find the roots of the characteristic polynomial and their algebraic
multiplicities by factoring.
3. If the algebraic multiplicities don't add up to $n$, then $A$ is
not diagonalizable, and you can stop. (If you are working over $\C$,
this can't happen.)
4. For each eigenvalue $\lambda$, compute the dimension of the
eigenspace $E_\lambda$. This is the geometric multiplicity of
$\lambda$, and if it is less than the algebraic multiplicity,
then $A$ is not diagonalizable, and you can stop.
5. Compute a basis for the eigenspace $E_\lambda$.
6. If for each eigenvalue the geometric multiplicity equals the
algebraic multiplicity, then you take the $n$ eigenvectors you
found and put them in the columns of a matrix $P$. Put the
eigenvalues in the same order on the diagonal of a matrix $D$.
7. Check that $AP=PD$.
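Numerically, the steps above can be sketched with numpy's `eig`, which returns the eigenvalues and a matrix $P$ whose columns are corresponding eigenvectors. This is a floating-point sketch, not the exact procedure: $AP = PD$ holds column by column for `eig`'s output even when $A$ is defective, so the extra thing to confirm is that $P$ is invertible.

```python
import numpy as np

A = np.array([[1.0, 3.0], [2.0, 2.0]])
n = A.shape[0]

# Steps 1-6 (numerically): eigenvalues, and eigenvectors as columns of P.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Step 7: AP = PD; A is diagonalizable exactly when P is invertible.
assert np.allclose(A @ P, P @ D)
assert np.linalg.matrix_rank(P) == n
```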
Lemma 4.26: If $\lambda_1$ is an eigenvalue of an $n \times n$ matrix $A$, then $$ \text{geometric multiplicity of } \lambda_1 \leqslant \text{algebraic multiplicity of } \lambda_1 $$
Proof (more direct than in text): Suppose that $\lambda_1$ is an eigenvalue of $A$ with geometric multiplicity $g$, and let $\vv_1, \ldots, \vv_g$ be a basis for $E_{\lambda_1}$, so $$A \vv_i = \lambda_1 \vv_i \qtext{for each $i$.}$$ Let $Q$ be an invertible matrix whose first $g$ columns are $\vv_1, \ldots, \vv_g$: $$ Q = [ \, \vv_1 \, \cdots \, \vv_g \qtext{other vectors} ] $$ (This is possible because we can keep choosing vectors not in the span of the previous ones.) Since $Q^{-1} Q = I$, we know that $Q^{-1} \vv_i = \ve_i$ for $1 \leqslant i \leqslant g$. So we can compute $$ \kern-8ex \begin{aligned} Q^{-1} A Q &= Q^{-1} A [ \, \vv_1 \, \cdots \, \vv_g \qtext{other vectors} ] \\ &= Q^{-1} [ \, A \vv_1 \, \cdots \, A \vv_g \qtext{other vectors} ] \\ &= Q^{-1} [ \, \lambda_1 \vv_1 \, \cdots \, \lambda_1 \vv_g \qtext{other vectors} ] \\ &= [ \, \lambda_1 Q^{-1} \vv_1 \, \cdots \, \lambda_1 Q^{-1} \vv_g \qtext{other vectors} ] \\ &= [ \, \lambda_1 \ve_1 \, \cdots \, \lambda_1 \ve_g \qtext{other vectors} ] \\ &= \bmat{rrrrrrrr} \lambda_1 & 0 & 0 & \cdots & 0 & * & \cdots & * \\ 0 & \lambda_1 & 0 & \cdots & 0 & * & \cdots & * \\ \vdots & & & & \vdots & * & \cdots & * \\ 0 & 0 & \cdots & 0 & \lambda_1 & * & \cdots & * \\ 0 & 0 & \cdots & 0 & 0 & * & \cdots & * \\ \vdots & & & & \vdots & * & \cdots & * \\ 0 & 0 & \cdots & 0 & 0 & * & \cdots & * \emat \end{aligned} $$ Therefore the matrix $Q^{-1} A Q$ has characteristic polynomial $(\lambda_1 - \lambda)^g f(\lambda)$, where $f(\lambda)$ is the characteristic polynomial of the bottom-right block. So $\lambda_1$ is an eigenvalue with algebraic multiplicity at least $g$. But $Q^{-1} A Q$ has the same characteristic polynomial as $A$, so $\lambda_1$ must also have algebraic multiplicity at least $g$ for $A$. $\Box$
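As a concrete instance of the lemma, here is a numerical sketch (assuming numpy; the matrix is chosen for illustration) where the inequality is strict:

```python
import numpy as np

# Upper triangular, so the characteristic polynomial is (2 - t)^2 (3 - t):
# lambda = 2 has algebraic multiplicity 2.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

alg = int(np.sum(np.isclose(np.linalg.eigvals(A), 2.0)))
geo = 3 - np.linalg.matrix_rank(A - 2.0 * np.eye(3))
print(geo, alg)  # 1 2: geometric <= algebraic, as the lemma guarantees
```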
If $A = P D P^{-1}$, then $A^k = P D^k P^{-1}$ for every $k \geqslant 1$. This is an efficient way to compute powers! Note that we need to know $P$, not just $D$, to do this. Also note that even if $A$ is real, it would work to diagonalize $A$ over $\C$: the final answer would be real, but the intermediate calculations would be complex.
Also note that if $A$ is invertible, then $D$ will be as well (why?), and $$ A^{-1} = (P D P^{-1})^{-1} = (P^{-1})^{-1} D^{-1} P^{-1} = P D^{-1} P^{-1}. $$ More generally, $A^k = P D^k P^{-1}$, for any integer $k$.
Example: Let $A = \bmat{rr} 1 & 3 \\ 2 & 2 \emat$. In Example 4.24 we found that $P^{-1} A P = D$, where $P = \bmat{rr} 1 & 3 \\ 1 & -2 \emat$ and $D = \bmat{rr} 4 & 0 \\ 0 & -1 \emat$. Therefore $$ \kern-8ex \begin{aligned} A^5\ &= P D^5 P^{-1} = \frac{1}{\det P} \bmat{rr} 1 & 3 \\ 1 & -2 \emat \bmat{rr} 4^5 & 0 \\ 0 & (-1)^5 \emat \bmat{rr} -2 & -3 \\ -1 & 1 \emat \\[5pt] &= \frac{1}{-5} \bmat{rr} 1 & 3 \\ 1 & -2 \emat \bmat{rr} 1024 & 0 \\ 0 & -1 \emat \bmat{rr} -2 & -3 \\ -1 & 1 \emat = \cdots = \bmat{rr} 409 & 615 \\ 410 & 614 \emat \end{aligned} $$
See also Example 4.29.
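A quick numerical check of this example, and of the formula $A^{-1} = P D^{-1} P^{-1}$, assuming numpy is available:

```python
import numpy as np

A = np.array([[1.0, 3.0], [2.0, 2.0]])
P = np.array([[1.0, 3.0], [1.0, -2.0]])
D = np.array([[4.0, 0.0], [0.0, -1.0]])
Pinv = np.linalg.inv(P)

# A^5 = P D^5 P^{-1}; equal to [[409, 615], [410, 614]] up to rounding.
A5 = P @ np.linalg.matrix_power(D, 5) @ Pinv
assert np.allclose(A5, np.linalg.matrix_power(A, 5))

# A^{-1} = P D^{-1} P^{-1}, since D has no zero diagonal entries.
assert np.allclose(np.linalg.inv(A), P @ np.linalg.inv(D) @ Pinv)
```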
Question 33-1: Let $A = \bmat{rr} 1/2 & 1/3 \\ 1/2 & 2/3 \emat$. (a) Find a clever way to figure out the eigenvalues of $A$ and determine whether it is diagonalizable.
(b) Compute $A^{10} \vx_0$, where $\vx_0 = \coll {1/2} {1/2}$.

The Fibonacci numbers are defined by $f_0 = 0$, $f_1 = 1$ and $f_n = f_{n-1} + f_{n-2}$ for $n \geq 2$. So we have: $$ \begin{aligned} f_0 &= 0 \\ f_1 &= 1 \\ f_2 &= f_1 + f_0 = 1 + 0 = 1 \\ f_3 &= f_2 + f_1 = 1 + 1 = 2 \\ f_4 &= f_3 + f_2 = 2 + 1 = 3 \\ f_5 &= f_4 + f_3 = 3 + 2 = 5 \\ f_6 &= f_5 + f_4 = 5 + 3 = 8 \end{aligned} $$
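As a quick sanity check of this table, the recurrence is a two-line loop in Python:

```python
# Fibonacci numbers from f_0 = 0, f_1 = 1, f_n = f_{n-1} + f_{n-2}.
fibs = [0, 1]
for n in range(2, 7):
    fibs.append(fibs[n - 1] + fibs[n - 2])
print(fibs)  # [0, 1, 1, 2, 3, 5, 8]
```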
Definition: A sequence $x_0, x_1, x_2, \ldots$ satisfies a linear recurrence relation of order $k$ if:
(1) $x_0 = a_0$, $x_1 = a_1$, $\ldots$, $x_{k-1} = a_{k-1}$, for scalars $a_0, \ldots, a_{k-1}$.
(2) For $n \geq k$, $x_n = c_1 x_{n-1} + c_2 x_{n-2} + \cdots + c_k x_{n-k}\strut$,
where $c_1, \ldots, c_k$ are scalars with $c_k \neq 0$.
The equations in (1) are called the initial conditions.
The Fibonacci numbers come from a linear recurrence relation of order 2.
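In matrix form (the same trick used in the next example, sketched here with numpy), the Fibonacci recurrence is governed by the matrix $\bmat{rr} 1 & 1 \\ 1 & 0 \emat$, whose powers contain consecutive Fibonacci numbers:

```python
import numpy as np

# A^n = [[f_{n+1}, f_n], [f_n, f_{n-1}]], provable by induction on n.
A = np.array([[1, 1], [1, 0]])
A6 = np.linalg.matrix_power(A, 6)
print(A6[0, 1])  # f_6 = 8
```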
Example 33-2: Suppose $x_0 = 4$, $x_1 = 1$ and $x_n = 2 x_{n-1} + 3 x_{n-2}$ for $n \geq 2$. Then $x_2 = 2(1) + 3(4) = 14$, $x_3 = 2(14) + 3(1) = 31$, etc.
We'd like to solve this recurrence relation, which means finding an explicit formula for $x_n$.
Clever idea: Write $\vx_n = \coll {x_n} {x_{n-1}}$ for $n \geq 1$. Then we have: $$ \begin{aligned} x_n &= 2 x_{n-1} + 3 x_{n-2} \\ x_{n-1} &= \phantom{2} x_{n-1} + 0 x_{n-2} \end{aligned} $$ So $\vx_n = A \vx_{n-1}$, where $A = \bmat{rr} 2 & 3 \\ 1 & 0 \emat$.
That means that $\vx_n = A^{n-1} \vx_1$, which we can compute by diagonalizing $A$!
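As a numerical check (assuming numpy), iterating the matrix reproduces the values computed by hand in Example 33-2:

```python
import numpy as np

# x_0 = 4, x_1 = 1, and x_n = 2 x_{n-1} + 3 x_{n-2}.
A = np.array([[2, 3], [1, 0]])
x1 = np.array([1, 4])  # the vector (x_1, x_0)

# The vector x_n = A^{n-1} x_1; with n = 3 this is (x_3, x_2).
x3 = np.linalg.matrix_power(A, 2) @ x1
print(x3)  # [31 14]
```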
The characteristic polynomial of $A$ is $$ \bdmat{cc} 2-\lambda & 3 \\ 1 & -\lambda \edmat = (2-\lambda)(-\lambda) - 3 = \lambda^2 -2\lambda-3 , $$ which factors as $(\lambda + 1)(\lambda - 3)$. So the eigenvalues are $\lambda = -1$ and $\lambda = 3$, and it's easy to check that these have eigenvectors $\coll 1 {-1}$ and $\coll 3 1$, respectively.
So we have $$ \kern-8ex P = \bmat{rr} 1 & 3 \\ -1 & 1 \emat \qtext{and} P^{-1} A P = D = \bmat{rr} -1 & 0 \\ 0 & 3 \emat . $$ So $A = PDP^{-1}$ and $$ \begin{aligned} A^k = PD^kP^{-1} &= \bmat{rr} 1 & 3 \\ -1 & 1 \emat \bmat{rr} (-1)^k & 0 \\ 0 & 3^k \emat \bmat{rr} 1 & 3 \\ -1 & 1 \emat^{-1} \\[5pt] &= \frac{1}{4} \bmat{rr} 1 & 3 \\ -1 & 1 \emat \bmat{rr} (-1)^k & 0 \\ 0 & 3^k \emat \bmat{rr} 1 & -3 \\ 1 & 1 \emat \\[5pt] &= \frac{1}{4} \bmat{rr} (-1)^k + 3^{k+1} & (-3)(-1)^k + 3^{k+1} \\ (-1)^{k+1}+3^k & (-3)(-1)^{k+1} + 3^k \emat \end{aligned} $$ Therefore $$ \kern-8ex \begin{aligned} \coll {x_n} {x_{n-1}} = \vx_n = A^{n-1} \vx_1 &= \frac{1}{4} \bmat{rr} (-1)^{n-1} + 3^n & (-3)(-1)^{n-1} + 3^n \\ (-1)^n+3^{n-1} & (-3)(-1)^n + 3^{n-1} \emat \coll 1 4 \\[5pt] &= \frac{1}{4} \coll {(-11)(-1)^{n-1} + (5)3^n} {(-11)(-1)^n + (5)3^{n-1}} \end{aligned} $$ So $x_n = \frac{11}{4} (-1)^n + \frac{5}{4} 3^n$.
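The closed formula can be checked directly against the recurrence (plain Python; no assumptions beyond the formulas above):

```python
# Closed form found above: x_n = (11/4)(-1)^n + (5/4) 3^n.
def x_closed(n):
    return 11 / 4 * (-1) ** n + 5 / 4 * 3 ** n

# The same values from the recurrence x_n = 2 x_{n-1} + 3 x_{n-2}.
xs = [4, 1]
for n in range(2, 8):
    xs.append(2 * xs[n - 1] + 3 * xs[n - 2])

assert all(x_closed(n) == xs[n] for n in range(8))
```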
Notice that $x_n$ is a linear combination of the eigenvalues raised to the $n$th power.
Now suppose that $x_n = a x_{n-1} + b x_{n-2}$ is any linear recurrence relation of order 2. Then the update process is governed by the matrix $$ \kern-8ex A = \bmat{rr} a & b \\ 1 & 0 \emat \qtext{and} \bdmat{cc} a - \lambda & b \\ 1 & -\lambda \edmat = (a-\lambda)(-\lambda) - b = \lambda^2 -a \lambda - b. $$ Suppose that this characteristic polynomial has roots $\lambda_1$ and $\lambda_2$.
(a) If $\lambda_1 \neq \lambda_2$, then $x_n = d_1 \lambda_1^n + d_2 \lambda_2^n$
for some scalars $d_1$ and $d_2$.
(b) If $\lambda_1 = \lambda_2$, then $x_n = d_1 \lambda_1^n + d_2 \red{n} \lambda_1^n$
for some scalars $d_1$ and $d_2$.
In either case, $d_1$ and $d_2$ can be determined using the initial conditions.
Example 33-2 (continued): We have that $x_n = 2 x_{n-1} + 3 x_{n-2}$, and the polynomial $\lambda^2 - 2 \lambda -3$ has roots $-1$ and $3$, so the solution must be of the form $x_n = d_1 (-1)^n + d_2 3^n$. The initial conditions give $d_1 + d_2 = x_0 = 4$ and $-d_1 + 3 d_2 = x_1 = 1$, so $d_2 = 5/4$ and $d_1 = 11/4$, recovering the same solution as before.
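The initial conditions give a $2 \times 2$ linear system for $d_1$ and $d_2$, which can also be solved numerically (a sketch assuming numpy):

```python
import numpy as np

# x_n = d1 (-1)^n + d2 3^n with x_0 = 4 and x_1 = 1:
#   n = 0:  d1 +   d2 = 4
#   n = 1: -d1 + 3 d2 = 1
M = np.array([[1.0, 1.0], [-1.0, 3.0]])
d = np.linalg.solve(M, np.array([4.0, 1.0]))
print(d)  # [2.75 1.25], i.e. d1 = 11/4 and d2 = 5/4, as before
```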