**Read** Section 4.2 for next class (Monday).
Work through recommended homework questions.

**Midterms** will be returned in tutorials next week.
Grades should be posted to OWL by today.
Solutions will be posted soon.
Please do not e-mail us about grades until you have seen your midterm.

**Tutorials:** Next week's quiz covers everything up to wherever we get
on Monday, with emphasis on the material after the midterm.
Details Monday.

**Help Centers:** Monday-Friday 2:30-6:30 in MC 106,
except Thursday and Friday this week (fall break).

**Theorem 3.32:** $[S \circ T] = [S][T]$.

We write $I$ for the **identity transformation**, which is
defined by $I(\vx) = \vx$. Then $[I] = I$, the identity matrix.

**Definition:** Let $S$ and $T$ be linear transformations
from $\R^\red{n}$ to $\R^\red{n}$. Then $S$ and $T$ are **inverse transformations**
if $S \circ T = I$ and $T \circ S = I$.
When this is the case, we say that $S$ and $T$ are **invertible**
and are **inverses**.

The same argument as for matrices shows that an inverse is unique when it exists, so we write $S = T^{-1}$ and $T = S^{-1}$.

**Theorem 3.33:** Let $T : \R^n \to \R^n$ be a linear transformation.
Then $T$ is invertible if and only if $[T]$ is an invertible matrix.
In this case, $[T^{-1}] = [T]^{-1}$.
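A quick numerical sketch of Theorem 3.33, using a hypothetical invertible $2 \times 2$ matrix $[T]$: composing $T$ with the transformation whose matrix is $[T]^{-1}$ gives the identity.

```python
def matmul(A, B):
    """Product of two 2x2 matrices, stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inverse(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    assert det != 0, "matrix is not invertible"
    return [[d / det, -b / det], [-c / det, a / det]]

T = [[1.0, 2.0], [3.0, 4.0]]   # hypothetical matrix [T]
Tinv = inverse(T)
print(matmul(T, Tinv))         # the identity matrix, [[1.0, 0.0], [0.0, 1.0]]
```

By Theorem 3.32, $[T][T^{-1}] = [T \circ T^{-1}] = [I] = I$, which is exactly what the product above verifies.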

A **Markov chain** has states $1, 2, \ldots, n$ and
**transition probabilities** $P_{ij}$ for moving from state $j$
to state $i$ at each time step.
These probabilities are constant and depend only on the current state.

$P$ is a **stochastic matrix**, which means that it is square,
has non-negative entries, and the columns each sum to $1$.

If $\vx_k$ is the **state vector** after $k$ time steps,
then $\vx_{k+1} = P \vx_k$.

A state $\vx$ with $P \vx = \vx$ is called a **steady state vector**.
That is, $\vx - P\vx = \vec 0$, or $(I - P) \vx = \vec 0$.
To find a non-trivial steady state vector for this Markov chain,
we solve the homogeneous system with coefficient matrix $I-P$.
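As a sketch, take a hypothetical $2$-state chain. Iterating $\vx_{k+1} = P \vx_k$ approaches the same steady state vector that we get by solving $(I - P)\vx = \vec 0$ directly.

```python
P = [[0.9, 0.3],    # column j holds the probabilities of moving from
     [0.1, 0.7]]    # state j to states 1 and 2; each column sums to 1

def step(P, x):
    """One time step: x_{k+1} = P x_k."""
    return [sum(P[i][j] * x[j] for j in range(2)) for i in range(2)]

x = [0.5, 0.5]      # initial state vector
for _ in range(100):
    x = step(P, x)

# Solving (I - P) x = 0 by hand gives x1 = 3 x2; normalizing so the
# entries sum to 1 yields the steady state (0.75, 0.25).
print(x)            # close to [0.75, 0.25]
```

The iterates converge here because the other eigenvalue of this $P$ has absolute value less than $1$; convergence is not part of the theorem below, which only promises that a steady state vector exists.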

**Theorem:** Every Markov chain has a non-trivial steady state vector.

This appears in the book as Theorem 4.30 in Section 4.6, but I proved it in class.

More generally, a central problem in linear algebra is to find $\vx$
such that $A \vx$ is a *scalar multiple* of $\vx$.

**Definition:** Let $A$ be an $\red{n} \times \red{n}$ matrix.
A scalar $\lambda$ (lambda) is called an **eigenvalue** of $A$ if
there is a nonzero vector $\vx$ such that $A \vx = \lambda \vx$.
Such a vector $\vx$ is called an **eigenvector** of $A$ corresponding to $\lambda$.

We showed that $\lambda = 1$ is an eigenvalue of every stochastic matrix $P$.

**Example A:** Since
$$
\bmat{rr} 1 & 2 \\ 2 & -2 \emat \coll 2 1 = \coll 4 2 = 2 \coll 2 1 ,
$$
we see that $2$ is an eigenvalue of $\bmat{rr} 1 & 2 \\ 2 & -2 \emat$
with eigenvector $\coll 2 1$.

**Example 4.2:** Show that $5$ is an eigenvalue of $A = \bmat{rr} 1 & 2 \\ 4 & 3 \emat$
and determine all eigenvectors corresponding to this eigenvalue.

**Solution:** We are looking for nonzero solutions to $A \vx = 5 \vx$.
This is the same as $(A - 5I) \vx = \vec 0$, so we compute the coefficient matrix:
$$
\kern-7ex
A - 5I = \bmat{rr} 1 & 2 \\ 4 & 3 \emat - \bmat{rr} 5 & 0 \\ 0 & 5 \emat
= \bmat{rr} -4 & 2 \\ 4 & -2 \emat
$$
The columns are linearly dependent, so the null space of $A-5I$ is nonzero.
So $A \vx = 5 \vx$ has a nontrivial solution, which is what it means
for $5$ to be an eigenvalue.

To find the eigenvectors, we compute the null space of $A - 5I$:
$$
\kern-7ex
[\, A-5I \mid \vec 0\,] = \bmat{rr|r} -4 & 2 & 0 \\ 4 & -2 & 0 \emat
\lra{} \bmat{rr|r} 1 & -1/2 & 0 \\ 0 & 0 & 0 \emat
$$
The solutions are of the form $\ccoll {t/2} t = t \ccoll {1/2} 1$.
So the eigenvectors for the eigenvalue $5$ are the *nonzero* multiples of
$\ccoll {1/2} 1$.
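A quick numerical check of Example 4.2, using the eigenvector $\coll 1 2$, a nonzero multiple of $\ccoll {1/2} 1$:

```python
A = [[1, 2],
     [4, 3]]
v = [1, 2]   # 2 times (1/2, 1), so still an eigenvector for lambda = 5

# Compute A v and confirm it equals 5 v.
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
print(Av)    # [5, 10], which is 5 * v
```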

**Definition:** Let $A$ be an $n \times n$ matrix and let $\lambda$ be
an eigenvalue of $A$. The collection of all eigenvectors corresponding
to $\lambda$, together with the zero vector, is a subspace called the
**eigenspace** of $\lambda$ and is denoted $E_\lambda$. In other words,
$$ E_\lambda = \null(A - \lambda I) . $$

In Example 4.2 above, $E_5 = \span\left\{ \ccoll {1/2} 1 \right\}$.

**Example:**
Give an eigenvalue of the matrix
$A = \bmat{rr} 2 & 0 \\ 0 & 2 \emat$ and compute its eigenspace.

Since $A \vx = 2 \vx$ for every $\vx$, $2$ is an eigenvalue, and is the only
eigenvalue. In this case, $E_2 = \R^2$.

**Example:**
If $0$ is an eigenvalue of $A$,
what is another name for $E_0$?

$E_0$ is the null space of $A - 0 I = A$. That is, $E_0 = \null(A)$.
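A sketch with a hypothetical singular matrix: any vector in $\null(A)$ satisfies $A\vx = \vec 0 = 0\,\vx$, so it is an eigenvector for $\lambda = 0$.

```python
A = [[1, 2],
     [2, 4]]    # the columns are dependent, so 0 is an eigenvalue
v = [2, -1]     # a nonzero vector spanning null(A)

# A v = 0 = 0 * v, so v is an eigenvector for lambda = 0.
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
print(Av)       # [0, 0]
```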

*(In class: an applet illustrating the transformation $T_A : \R^2 \to \R^2$, for $A$ a $2 \times 2$ matrix. The black vector is the input $\vx$, and the blue vector is the output $\color{blue} T_A(\vx) = A \vx$. Examples shown: reflection in the $x$-axis, reflection in the $y$-axis, projection onto the $x$-axis, rotation by $90^\circ$ ccw, rotate and scale, Example A from above, a rank $1$ example, and custom matrices.)*

Read Example 4.3 in the text for a $3 \times 3$ example.

Given a specific number $\lambda$, we now know how to check whether $\lambda$ is an eigenvalue: we check whether $A - \lambda I$ has a nontrivial null space. And we can find the eigenvectors by finding the null space.

We also have a geometric way to find **all** eigenvalues $\lambda$, at
least in the $2 \times 2$ case.
Is there an algebraic way to check all $\lambda$ at once?

By the fundamental theorem of invertible matrices, $A - \lambda I$ has a nontrivial null space if and only if it is not invertible. For $2 \times 2$ matrices, we can check invertibility using the determinant!

**Example:** Find all eigenvalues of $A = \bmat{rr} 1 & 2 \\ 2 & -2 \emat$.

**Solution:** We need to find all $\lambda$ such that $\det(A-\lambda I) = 0$.
$$
\kern-6ex
\begin{aligned}
\det(A-\lambda I) &= \det \bmat{cc} 1-\lambda & 2 \\ 2 & -2-\lambda \emat \\
&= (1-\lambda)(-2-\lambda)-4 = \lambda^2 + \lambda - 6 ,
\end{aligned}
$$
so we need to solve the quadratic equation $\lambda^2 + \lambda - 6 = 0$.
This can be factored as $(\lambda - 2)(\lambda + 3) = 0$, and so
$\lambda = 2$ or $\lambda = -3$, the same as we saw above and with the applet.

We proceed to find the eigenvectors for these eigenvalues, by solving $(A - 2I) \vx = \vec 0$ and $(A + 3I) \vx = \vec 0$. On board, if time.
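For any $2 \times 2$ matrix $A = \bmat{rr} a & b \\ c & d \emat$, expanding as above gives $\det(A - \lambda I) = \lambda^2 - (a+d)\lambda + (ad - bc)$, so the eigenvalues come from the quadratic formula. A sketch for the matrix of this example:

```python
import math

a, b, c, d = 1, 2, 2, -2          # A = [[1, 2], [2, -2]]
tr = a + d                        # trace of A
det = a * d - b * c               # determinant of A

# Characteristic polynomial: lambda^2 - tr*lambda + det = 0.
disc = tr * tr - 4 * det          # discriminant; here 1 + 24 = 25
lams = sorted([(tr - math.sqrt(disc)) / 2,
               (tr + math.sqrt(disc)) / 2])
print(lams)                       # [-3.0, 2.0]
```

When the discriminant is negative there are no real eigenvalues, which is one reason complex numbers come up later.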

**Appendix D** provides review of polynomials and their solutions.
Look it over now. We'll discuss it in Lecture 26.

See also Example 4.5 in text.

So now we know how to handle the $2 \times 2$ case. To handle larger matrices, we need to learn about their determinants, which is Section 4.2.

We won't discuss eigenvectors and eigenvalues for matrices over $\Z_m$. We will discuss complex numbers $\C$ in a later lecture.