**eigenvector** as a vector whose direction doesn't change under the transformation. It may be scaled or negated, but it always stays on its line.

An **eigenvalue** is how much it is scaled. For example, if the vector $\begin{bmatrix}2 \\ 3\end{bmatrix}$ becomes $\begin{bmatrix}3 \\ 2\end{bmatrix}$ after a transformation, it is not an eigenvector. However, if it becomes $\begin{bmatrix}-4 \\ -6\end{bmatrix}$, it **is** an eigenvector, and its eigenvalue is $-2$.
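To see this numerically, here is a quick check in plain Python. The matrix $\begin{bmatrix}7 & -6 \\ 9 & -8\end{bmatrix}$ is just a made-up example that happens to send $\begin{bmatrix}2 \\ 3\end{bmatrix}$ to $\begin{bmatrix}-4 \\ -6\end{bmatrix}$:

```python
# Apply a 2x2 matrix to a 2D vector.
def matvec(A, v):
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

# Example matrix (chosen purely for illustration) and the vector from the text.
A = [[7, -6], [9, -8]]
v = [2, 3]

print(matvec(A, v))          # [-4, -6]
print([-2 * c for c in v])   # also [-4, -6]: v is an eigenvector, eigenvalue -2
```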

# Formal Definition

An **eigenvector** of a matrix $A$ is a vector **x** such that $A\mathbf{x} = \lambda\mathbf{x}$ for some scalar $\lambda$. That scalar $\lambda$ is called an **eigenvalue** of $A$ only if there is a nontrivial (**x** is nonzero) solution of that equation.
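The definition translates directly into code. This is a minimal sketch; the helper name `is_eigenpair` and the example matrix are my own, purely for illustration:

```python
# Check whether (x, lam) satisfies A x = lam * x with x nonzero.
def is_eigenpair(A, x, lam):
    if all(c == 0 for c in x):          # the trivial solution never counts
        return False
    Ax = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
    return Ax == [lam * c for c in x]

A = [[2, 0], [1, 3]]
print(is_eigenpair(A, [0, 1], 3))   # True:  A maps [0, 1] to [0, 3]
print(is_eigenpair(A, [1, 0], 2))   # False: A maps [1, 0] to [2, 1]
print(is_eigenpair(A, [0, 0], 5))   # False: the zero vector is excluded
```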

Note: $\vec{0}$ is never counted as an eigenvector, because $A\vec{0} = \lambda\vec{0}$ holds for every matrix $A$ and every scalar $\lambda$, so it would tell us nothing.

# Finding Eigenvectors and Eigenvalues

We're going to dedicate an entire post to calculating eigenvalues and eigenvectors; for now, it's important to know these two theorems:

## Theorem: Triangular Matrix

The eigenvalues of a triangular matrix are the entries on its main diagonal.

### Justification (Not full-on proof)

If $A$ is triangular, then $A - \lambda I$ is also triangular, with diagonal entries $a_{ii} - \lambda$. A triangular matrix is non-invertible exactly when one of its diagonal entries is zero, so $(A - \lambda I)\mathbf{x} = \vec{0}$ has a nontrivial solution exactly when $\lambda$ equals one of the diagonal entries of $A$.
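Here is a quick numerical sanity check of the theorem, using an upper-triangular matrix made up for illustration (the eigenvectors were worked out by hand for this particular matrix):

```python
# Upper-triangular matrix: the theorem says its eigenvalues are 2, 3, and 5.
A = [[2, 1, 4],
     [0, 3, 7],
     [0, 0, 5]]

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

# One eigenvector for each diagonal entry, found by solving (A - lam*I)x = 0:
for lam, x in [(2, [1, 0, 0]), (3, [1, 1, 0]), (5, [5, 7, 2])]:
    print(lam, matvec(A, x) == [lam * c for c in x])   # prints True each time
```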

## Theorem 2: Eigenvectors corresponding to distinct eigenvalues are linearly independent

### Application

This lets us write a vector as a linear combination of eigenvectors, which makes repeated matrix multiplication really easy: if $\mathbf{v} = c_1\mathbf{x}_1 + c_2\mathbf{x}_2$, then $A\mathbf{v} = c_1\lambda_1\mathbf{x}_1 + c_2\lambda_2\mathbf{x}_2$, and more generally $A^k\mathbf{v} = c_1\lambda_1^k\mathbf{x}_1 + c_2\lambda_2^k\mathbf{x}_2$.
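As a sketch, assume the made-up matrix $A = \begin{bmatrix}7 & -6 \\ 9 & -8\end{bmatrix}$, which has eigenpairs $\lambda_1 = -2$, $\mathbf{x}_1 = \begin{bmatrix}2 \\ 3\end{bmatrix}$ and $\lambda_2 = 1$, $\mathbf{x}_2 = \begin{bmatrix}1 \\ 1\end{bmatrix}$. Then applying $A$ ten times collapses to scaling each eigenvector by a power of its eigenvalue:

```python
# Illustrative 2x2 matrix with eigenpairs (-2, [2, 3]) and (1, [1, 1]).
A = [[7, -6], [9, -8]]

def matvec(A, v):
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

# Write v as a combination of eigenvectors: v = 2*[2, 3] + 3*[1, 1] = [7, 9].
v = [7, 9]

# Apply A ten times the slow way...
slow = v
for _ in range(10):
    slow = matvec(A, slow)

# ...and the easy way: A^10 v = 2*(-2)^10 * [2, 3] + 3*(1)^10 * [1, 1].
fast = [2 * (-2)**10 * 2 + 3 * 1,
        2 * (-2)**10 * 3 + 3 * 1]

print(slow == fast)   # True
```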