

In the previous post, we used the word "similar" without defining it, and we stated that two similar matrices have the same characteristic polynomial.

Definition of Similar and Diagonalization

$A$ is similar to $B$ if there is an invertible matrix $P$ such that $A = PBP^{-1}$. When $B$ is a diagonal matrix $D$, the process of writing $A$ as $PDP^{-1}$ is known as diagonalization. Here's why it's useful:

Let's say you have a diagonal matrix D. If D = $\begin{bmatrix}5 & 0 \\ 0 & 3\end{bmatrix}$, then $D^k = \begin{bmatrix}5^k & 0 \\ 0 & 3^k\end{bmatrix}$: powers of a diagonal matrix are computed entrywise. That's what the middle matrix is going to look like. For the characteristic polynomials of $A$ and $D$ to be the same, the entries on the diagonal must be the eigenvalues of $A$, with the same multiplicities.
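The entrywise shortcut can be checked against brute-force multiplication. A minimal sketch in plain Python, using nested lists for a $2 \times 2$ matrix (the `mat_mul` and `mat_pow` helpers are just for illustration, not from the post):

```python
def mat_mul(A, B):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(A, k):
    """Naive A^k by repeated multiplication."""
    result = [[1, 0], [0, 1]]  # 2x2 identity
    for _ in range(k):
        result = mat_mul(result, A)
    return result

D = [[5, 0], [0, 3]]
k = 4
# Shortcut for a diagonal matrix: raise each diagonal entry to the power.
shortcut = [[5**k, 0], [0, 3**k]]
assert mat_pow(D, k) == shortcut  # both give [[625, 0], [0, 81]]
```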

Diagonalization Theorem 

An $n \times n$ matrix $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors.
The columns of $P$ are the eigenvectors of $A$.
The diagonal entries of $D$ are the corresponding eigenvalues.
Diagonalize $\begin{bmatrix}7 & 2\\-4 & 1\end{bmatrix}$

First, we have to find the eigenvalues. $$\begin{vmatrix}7-\lambda & 2\\-4 & 1-\lambda\end{vmatrix} = 0 \to (\lambda - 7)(\lambda - 1) + 8 = 0 \to (\lambda - 5)(\lambda - 3) = 0$$ So, our eigenvalues are $\boxed{3 \text{ and } 5}$.
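For any $2 \times 2$ matrix, the characteristic polynomial expands to $\lambda^2 - (\mathrm{trace})\lambda + \det$, so the eigenvalues fall out of the quadratic formula. A quick check of the example above (variable names are illustrative):

```python
import math

A = [[7, 2], [-4, 1]]
trace = A[0][0] + A[1][1]                    # 7 + 1 = 8
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # 7 - (-8) = 15
disc = trace**2 - 4 * det                    # 64 - 60 = 4
lam1 = (trace + math.sqrt(disc)) / 2         # 5.0
lam2 = (trace - math.sqrt(disc)) / 2         # 3.0
assert (lam1, lam2) == (5.0, 3.0)
```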
When the eigenvalue is 5, $A - 5I = \begin{bmatrix}2 & 2\\-4 & -4\end{bmatrix}$, so an eigenvector is $\begin{bmatrix}1\\-1\end{bmatrix}$.

When the eigenvalue is 3, $A - 3I = \begin{bmatrix}4 & 2\\-4 & -2\end{bmatrix}$, so an eigenvector is $\begin{bmatrix}1\\-2\end{bmatrix}$. Now we have our P matrix composed of eigenvectors and our D matrix composed of eigenvalues. The inverse of P comes from the $2 \times 2$ formula $\begin{bmatrix}a & b\\c & d\end{bmatrix}^{-1} = \frac{1}{ad-bc}\begin{bmatrix}d & -b\\-c & a\end{bmatrix}$; here $\det P = -1$. $$\begin{bmatrix}7 & 2\\-4 & 1\end{bmatrix} = \boxed{\begin{bmatrix}1 & 1\\-1 & -2\end{bmatrix} \begin{bmatrix}5 & 0\\0 & 3\end{bmatrix} \begin{bmatrix}2 & 1\\-1 & -1\end{bmatrix}}$$
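The boxed factorization is easy to verify by multiplying it back out. A sketch in plain Python, with a hypothetical `mat_mul` helper for $2 \times 2$ products:

```python
def mat_mul(A, B):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[7, 2], [-4, 1]]
P = [[1, 1], [-1, -2]]   # columns are the eigenvectors
D = [[5, 0], [0, 3]]     # diagonal entries are the eigenvalues

# 2x2 inverse formula: (1/det) * [[d, -b], [-c, a]]
a, b = P[0]
c, d = P[1]
det = a * d - b * c      # det P = -1
P_inv = [[d / det, -b / det], [-c / det, a / det]]

product = mat_mul(mat_mul(P, D), P_inv)
assert product == A      # P D P^{-1} recovers the original matrix
```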

Theorem #1

An $n \times n$ matrix with $n$ distinct eigenvalues is diagonalizable.

Note: this condition is sufficient but not necessary; a matrix can be diagonalizable without having $n$ distinct eigenvalues.
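The $2 \times 2$ identity matrix is the simplest illustration: its only eigenvalue is 1, with multiplicity 2, yet it is already diagonal, because every nonzero vector is an eigenvector. A quick sanity check in plain Python (names are illustrative):

```python
I2 = [[1, 0], [0, 1]]
# Any vector v satisfies I2 @ v == 1 * v, so the eigenspace for
# eigenvalue 1 is all of R^2 (dimension 2 = multiplicity 2).
v = [3, -7]
Iv = [I2[0][0] * v[0] + I2[0][1] * v[1],
      I2[1][0] * v[0] + I2[1][1] * v[1]]
assert Iv == v
```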

Theorem #2

Part One

For each eigenvalue, the dimension of its eigenspace is less than or equal to its multiplicity.

Part Two

A matrix is diagonalizable iff the sum of the dimensions of the eigenspaces equals n, and this happens only when the dimension of the eigenspace equals the multiplicity for each eigenvalue.
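A matrix fails this test when some eigenspace is smaller than its multiplicity. The standard example is the shear $\begin{bmatrix}1 & 1\\0 & 1\end{bmatrix}$: its only eigenvalue is 1 with multiplicity 2, but $(A - I)v = 0$ forces the second component of $v$ to be zero, so the eigenspace is only 1-dimensional and the dimensions sum to $1 < 2$. A sketch in plain Python (the `apply` helper is illustrative):

```python
def apply(M, v):
    """Apply a 2x2 matrix to a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# For the shear A = [[1, 1], [0, 1]], A - I = [[0, 1], [0, 0]].
AmI = [[0, 1], [0, 0]]
assert apply(AmI, [1, 0]) == [0, 0]  # (1, 0) is an eigenvector
assert apply(AmI, [0, 1]) != [0, 0]  # (0, 1) is not: eigenspace dim is 1
```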

Part Three

If $A$ is diagonalizable, and $\mathcal{B}_k$ is a basis for the eigenspace corresponding to the $k$'th eigenvalue for each $k$, then the collection of all the vectors in those bases forms an eigenvector basis for $\mathbb{R}^n$.


Raising a matrix to a very high power

If $A = PDP^{-1}$, then $$A^k = PDP^{-1}PDP^{-1}PDP^{-1} \cdots PDP^{-1}$$ $$=PD(P^{-1}P)D(P^{-1}P)D(P^{-1}P) \cdots DP^{-1}$$ $$=PDIDID \cdots DP^{-1}$$ $$=PD^kP^{-1}$$ Recall that raising a diagonal matrix to a power is as easy as raising the entries on the diagonal to the same power.
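Applied to the worked example, $A^k = PD^kP^{-1}$ replaces $k$ matrix multiplications with two, plus entrywise powering of $D$. A sketch in plain Python, reusing the matrices computed above (the helper names are illustrative):

```python
def mat_mul(A, B):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow_naive(A, k):
    """Slow A^k by repeated multiplication, for comparison."""
    R = [[1, 0], [0, 1]]
    for _ in range(k):
        R = mat_mul(R, A)
    return R

A = [[7, 2], [-4, 1]]
P = [[1, 1], [-1, -2]]
P_inv = [[2, 1], [-1, -1]]

k = 6
Dk = [[5**k, 0], [0, 3**k]]            # diagonal entries powered
fast = mat_mul(mat_mul(P, Dk), P_inv)  # A^k = P D^k P^{-1}
assert fast == mat_pow_naive(A, k)
```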


$$A = PDP^{-1}$$ If $\mathcal{B}$ is the basis for $\mathbb{R}^n$ formed by the columns of $P$, then $D$ is the $\mathcal{B}$-matrix for the transformation $\mathbf{x} \rightarrow A\mathbf{x}$.
