## Announcements:

Read Section 4.2 for next class. Work through recommended homework questions.

Midterms available for pick-up starting Thursday. (If you have a Wednesday tutorial, your TA will be available Thursday or Friday.) Solutions will be posted Thursday.

Drop date: Friday, March 7.

Tutorials: Quiz this week covers Sections 3.5 and 3.6, focusing on 3.6.

Help Centers: Monday-Friday 2:30-6:30 in MC 106.

### New Material: Section 3.7: Markov chains (cont)

Example 3.65: A Markov chain can have more than two states. A rat is in a maze with three rooms, and always chooses to go through one of the doors with equal probability. Draw the state diagram, determine the transition matrix $P$ and describe how to find a steady-state vector.
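The exact transition matrix depends on the maze's door layout. As a sketch, suppose each room has one door to each of the other two rooms (an assumed layout, not necessarily the one in the book); then a numerical steady-state computation looks like this:

```python
import numpy as np

# Assumed maze: each of the 3 rooms has a door to each other room, so from
# any room the rat moves to either other room with probability 1/2.
# Column j of P holds the probabilities of moving from room j to rooms 1, 2, 3.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# A steady-state vector x satisfies P x = x, i.e. (I - P) x = 0.
# Append the condition that the entries sum to 1 and solve by least squares.
A = np.vstack([np.eye(3) - P, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)  # by symmetry, each entry is 1/3
```

With a less symmetric door layout the same code applies; only the matrix `P` changes.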

Theorem: Every Markov chain has a non-trivial steady-state vector.

This appears in the book as Theorem 4.30 in Section 4.6.

Proof: Let $P$ be the transition matrix. We want to find a non-trivial solution to $(I - P)\vx = \vec 0$. By the fundamental theorem of invertible matrices and the fact that $\rank (I - P) = \rank ((I - P)^T)$, this is equivalent to $(I - P)^T \vx = \vec 0$ having a non-trivial solution. That is, to finding a non-trivial $\vx$ such that $$P^T \vx = \vx \qtext{(since $(I-P)^T = I - P^T$).}$$ But since $P$ is a stochastic matrix, each of its columns sums to $1$, so we always have $$P^T \ccolll 1 {\vdots} 1 = \ccolll 1 {\vdots} 1 .$$ So $P^T \vx = \vx$ has a non-trivial solution, and therefore $P \vx = \vx$ also has a (generally different) non-trivial solution.$\qquad\Box$
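A quick numerical illustration of the two facts used in the proof, for an illustrative stochastic matrix (chosen here for the example, not from the notes): the all-ones vector is fixed by $P^T$, and $I - P$ is singular, so $P\vx = \vx$ has non-trivial solutions.

```python
import numpy as np

# An illustrative column-stochastic matrix: nonnegative entries,
# each column summing to 1.
P = np.array([[0.7, 0.2],
              [0.3, 0.8]])

ones = np.ones(2)
print(P.T @ ones)                     # the all-ones vector satisfies P^T x = x
print(np.linalg.det(np.eye(2) - P))   # ~0, so (I - P) x = 0 has nontrivial solutions
```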

We'll probably study Markov chains again in Section 4.6.

### Section 4.1: Eigenvalues and eigenvectors

We saw when studying Markov chains that it was important to find solutions to the system $A \vx = \vx$, where $A$ is a square matrix. We did this by solving $(I-A)\vx = \vec 0$.

More generally, a central problem in linear algebra is to find $\vx$ such that $A \vx$ is a scalar multiple of $\vx$.

Definition: Let $A$ be an $n \times n$ matrix. A scalar $\lambda$ (lambda) is called an eigenvalue of $A$ if there is a nonzero vector $\vx$ such that $A \vx = \lambda \vx$. Such a vector $\vx$ is called an eigenvector of $A$ corresponding to $\lambda$.

We showed that $\lambda = 1$ is an eigenvalue of every stochastic matrix $P$.

Example A: Since $$\bmat{rr} 1 & 2 \\ 2 & -2 \emat \coll 2 1 = \coll 4 2 = 2 \coll 2 1 ,$$ we see that $2$ is an eigenvalue of $\bmat{rr} 1 & 2 \\ 2 & -2 \emat$ with eigenvector $\coll 2 1$.
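The defining equation $A\vx = \lambda\vx$ from Example A is easy to check numerically:

```python
import numpy as np

A = np.array([[1, 2],
              [2, -2]])
v = np.array([2, 1])

print(A @ v)  # [4 2], which is 2 * v, so v is an eigenvector for lambda = 2
```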

Example 4.2: Show that $5$ is an eigenvalue of $A = \bmat{rr} 1 & 2 \\ 4 & 3 \emat$ and determine all eigenvectors corresponding to this eigenvalue.

Solution: We are looking for nonzero solutions to $A \vx = 5 \vx$. This is the same as $(A - 5I) \vx = \vec 0$, so we compute the coefficient matrix: $$\kern-7ex A - 5I = \bmat{rr} 1 & 2 \\ 4 & 3 \emat - \bmat{rr} 5 & 0 \\ 0 & 5 \emat = \bmat{rr} -4 & 2 \\ 4 & -2 \emat$$ The columns are linearly dependent, so the null space of $A-5I$ is nonzero. So $A \vx = 5 \vx$ has a nontrivial solution, which is what it means for $5$ to be an eigenvalue.

To find the eigenvectors, we compute the null space of $A - 5I$: $$\kern-7ex [\, A-5I \mid \vec 0\,] = \bmat{rr|r} -4 & 2 & 0 \\ 4 & -2 & 0 \emat \lra{} \bmat{rr|r} 1 & -1/2 & 0 \\ 0 & 0 & 0 \emat$$ The solutions are of the form $\ccoll {t/2} t = t \ccoll {1/2} 1$. So the eigenvectors for the eigenvalue $5$ are the nonzero multiples of ${\ccoll {1/2} 1}^{\strut}$.
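The two steps of Example 4.2 can be checked numerically: $A - 5I$ is singular (so $5$ is an eigenvalue), and any multiple of $\ccoll {1/2} 1$, e.g. $t = 2$ giving $(1, 2)$, is an eigenvector.

```python
import numpy as np

A = np.array([[1, 2],
              [4, 3]])

# 5 is an eigenvalue if and only if A - 5I is singular:
M = A - 5 * np.eye(2)
print(np.linalg.det(M))   # 0 (up to rounding)

# Taking t = 2 in the general solution gives the eigenvector (1, 2):
v = np.array([1, 2])
print(A @ v)              # [ 5 10], which is 5 * v
```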

Definition: Let $A$ be an $n \times n$ matrix and let $\lambda$ be an eigenvalue of $A$. The collection of all eigenvectors corresponding to $\lambda$, together with the zero vector, is a subspace called the eigenspace of $\lambda$ and is denoted $E_\lambda$. In other words, $$E_\lambda = \null(A - \lambda I) .$$

In the above Example, $E_5 = \span\left\{ \ccoll {1/2} 1 \right\}$.

An in-class applet illustrated the transformation $T_A : \R^2 \to \R^2$, for $A$ a $2 \times 2$ matrix chosen from a list: reflection in the $x$-axis, reflection in the $y$-axis, projection onto the $x$-axis, rotation by $90^\circ$ counterclockwise, a rotation combined with scaling, Example A from above, a rank 1 example, or a custom matrix. The black vector is the input $\vx$, and the blue vector is the output $\color{blue} T_A(\vx) = A \vx$. The eigenvectors are exactly the nonzero inputs $\vx$ for which the output is parallel to $\vx$.

Read Example 4.3 in the text for a $3 \times 3$ example.

### Finding eigenvalues

Given a specific number $\lambda$, we now know how to check whether $\lambda$ is an eigenvalue: we check whether $A - \lambda I$ has a nontrivial null space. And we can find the eigenvectors by finding the null space.

We also have a geometric way to find all eigenvalues $\lambda$, at least in the $2 \times 2$ case. Is there an algebraic way to check all $\lambda$ at once?

By the fundamental theorem of invertible matrices, $A - \lambda I$ has a nontrivial null space if and only if it is not invertible. For $2 \times 2$ matrices, we can check invertibility using the determinant!

Example: Find all eigenvalues of $A = \bmat{rr} 1 & 2 \\ 2 & -2 \emat$.

Solution: We need to find all $\lambda$ such that $\det(A-\lambda I) = 0$. $$\kern-6ex \begin{aligned} \det(A-\lambda I) &= \det \bmat{cc} 1-\lambda & 2 \\ 2 & -2-\lambda \emat \\ &= (1-\lambda)(-2-\lambda)-4 = \lambda^2 + \lambda - 6 , \end{aligned}$$ so we need to solve the quadratic equation $\lambda^2 + \lambda - 6 = 0$. This factors as $(\lambda - 2)(\lambda + 3) = 0$, so $\lambda = 2$ or $\lambda = -3$, the same as we saw above and with the applet.
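The same answer can be found numerically, either by finding the roots of the characteristic polynomial $\lambda^2 + \lambda - 6$ or by asking NumPy for the eigenvalues directly:

```python
import numpy as np

# Roots of the characteristic polynomial lambda^2 + lambda - 6:
print(sorted(np.roots([1, 1, -6])))   # [-3.0, 2.0]

# NumPy can also compute the eigenvalues of A directly:
A = np.array([[1, 2],
              [2, -2]])
print(sorted(np.linalg.eigvals(A)))   # [-3.0, 2.0]
```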

We proceed to find the eigenvectors for these eigenvalues, by solving $(A - 2I) \vx = \vec 0$ and $(A + 3I) \vx = \vec 0$. On board, if time.
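Carrying out those two null-space computations by hand and checking the results numerically:

```python
import numpy as np

A = np.array([[1, 2],
              [2, -2]])

# For lambda = 2: (A - 2I) v = 0 with A - 2I = [[-1, 2], [2, -4]],
# so v1 = 2 v2; taking v2 = 1 gives v = (2, 1), as in Example A.
v_2 = np.array([2, 1])
print(A @ v_2)   # [4 2], which is 2 * v_2

# For lambda = -3: (A + 3I) v = 0 with A + 3I = [[4, 2], [2, 1]],
# so 2 v1 + v2 = 0; taking v1 = 1 gives v = (1, -2).
v_m3 = np.array([1, -2])
print(A @ v_m3)  # [-3 6], which is -3 * v_m3
```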

Appendix D provides a review of polynomials and their roots.

So now we know how to handle the $2 \times 2$ case. To handle larger matrices, we need to learn about their determinants, which is Section 4.2.
We won't discuss eigenvectors and eigenvalues for matrices over $\Z_m$. We will discuss complex numbers $\C$ in a later lecture.