7 Eigenvalues, Eigenvectors and all that
“It is frequent in mathematics that every instance of a concept of seemingly great generality is in essence the same as a small and concrete special case.”
– Paul Halmos
7.1 Eigenvalues and Eigenvectors
A square n \times n matrix \mathbf{A} is said to have an eigenvalue-eigenvector pair (\lambda, \mathbf{v}) if there is a scalar \lambda and a non-zero vector \mathbf{v} \in \mathbb{R}^n such that \mathbf{A} \mathbf{v} = \lambda \mathbf{v}.
You should now notice that an n \times n matrix is singular iff it has 0 as an eigenvalue.
Exercise 7.1
- Find the eigenvalues and eigenvectors of the matrices A=\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, B=\begin{bmatrix} -4 & -4 & 2\\ 3 & 4 & -1 \\ -3 & -2 & 3 \end{bmatrix}
- Find a 3 \times 3 matrix \mathbf{A} such that one of the eigenvalues of \mathbf{A} is zero.
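After computing eigenpairs by hand, it is easy to check the defining relation \mathbf{A}\mathbf{v} = \lambda \mathbf{v} numerically. The sketch below (assuming NumPy, which is not part of the text) does this for the first matrix of Exercise 7.1.

```python
import numpy as np

# Check the defining relation A v = lambda v for each computed eigenpair.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors` are eigenvectors

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, lam * v)  # A v = lambda v holds for each pair
```

Note that `np.linalg.eig` returns the eigenvectors as the *columns* of its second output, normalized to unit length.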
TFAE:
\lambda \in \mathbb{C} is an eigenvalue of the n \times n matrix \mathbf{A}
(\mathbf{A} - \lambda \mathbf{I}) \mathbf{v}=\mathbf{0} has a non-trivial solution
\mathcal{N} (\mathbf{A} - \lambda \mathbf{I}) contains a non-zero vector \mathbf{v}
\mathbf{A} - \lambda \mathbf{I} is singular
det (\mathbf{A} - \lambda \mathbf{I}) =0 (characteristic equation)
\textbf{rank} (\mathbf{A} - \lambda \mathbf{I}) < n
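These equivalences can be observed numerically. The sketch below (assuming NumPy, which is not part of the text) checks the characteristic-equation and rank conditions for a known eigenvalue.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0  # an eigenvalue of A
n = A.shape[0]

M = A - lam * np.eye(n)
# det(A - lambda I) = 0: the characteristic equation holds
assert np.isclose(np.linalg.det(M), 0.0)
# rank(A - lambda I) < n: the matrix A - lambda I is singular
assert np.linalg.matrix_rank(M) < n
```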
- Real matrices can have complex eigenvalues.
- An n \times n matrix has at least one and at most n distinct eigenvalues (over \mathbb{C})
- The sum of the eigenvalues of a matrix (counted with algebraic multiplicity) is equal to its trace.
- The product of the eigenvalues of a matrix (counted with algebraic multiplicity) is equal to its determinant.
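The trace and determinant identities are easy to verify numerically on a random matrix (a sketch assuming NumPy, which is not part of the text):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a random real 4 x 4 matrix

vals = np.linalg.eigvals(A)  # the 4 eigenvalues, counted with algebraic multiplicity
# Complex eigenvalues of a real matrix come in conjugate pairs,
# so the sum and product below are (numerically) real.
assert np.isclose(vals.sum(), np.trace(A))          # sum of eigenvalues = trace
assert np.isclose(np.prod(vals), np.linalg.det(A))  # product of eigenvalues = determinant
```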
Let \lambda be an eigenvalue of an n \times n matrix \mathbf{A}. Then:
- The algebraic multiplicity of \lambda is its multiplicity as a root of the characteristic equation
- The geometric multiplicity of \lambda is the dimension of the eigenspace \mathcal{N} (\mathbf{A} - \lambda \mathbf{I})
It can be shown that, for any eigenvalue \lambda of an n \times n matrix \mathbf{A}, geometric multiplicity \le algebraic multiplicity.
You should now notice that the eigenspace of an eigenvalue with algebraic multiplicity 1 must be 1-dimensional.
Exercise 7.2 Prove that geometric multiplicity is bounded above by algebraic multiplicity.
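A concrete instance where the inequality is strict is the shear matrix: its only eigenvalue has algebraic multiplicity 2 but a 1-dimensional eigenspace. A numerical sketch (assuming NumPy, not part of the text):

```python
import numpy as np

# A shear matrix: characteristic polynomial (1 - lambda)^2,
# so lambda = 1 has algebraic multiplicity 2 ...
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0
n = A.shape[0]

# ... but the eigenspace N(A - I) is only 1-dimensional (rank-nullity):
geometric_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
assert geometric_mult == 1  # geometric multiplicity 1 < 2 = algebraic multiplicity
```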
7.2 Eigenbasis (A Basis consisting of Eigenvectors)
- If an n \times n matrix has n distinct \textbf{real} eigenvalues, then the corresponding \textbf{real} eigenvectors form a basis of \mathbb{R}^n
- If an n \times n matrix has n distinct \textbf{real or complex} eigenvalues, then the corresponding eigenvectors form a basis of \mathbb{C}^n
Let \mathbf{A} be an n \times n matrix.
TFAE:
For each of the eigenvalues of \mathbf{A}, the algebraic and geometric multiplicities coincide
The eigenvectors span \mathbb{C}^n
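As a numerical illustration (a sketch assuming NumPy, which is not part of the text): for a matrix with n distinct eigenvalues, the matrix whose columns are the eigenvectors has full rank, i.e. the eigenvectors span the whole space.

```python
import numpy as np

# A triangular matrix with 3 distinct eigenvalues (its diagonal entries): 2, 3, 5.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 5.0]])

vals, V = np.linalg.eig(A)  # columns of V are eigenvectors
# Distinct eigenvalues give linearly independent eigenvectors,
# so V is non-singular and its columns form an eigenbasis.
assert np.linalg.matrix_rank(V) == 3
```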
7.3 Diagonalization
- Matrices \mathbf{A} and \mathbf{B} are said to be similar if there is a non-singular matrix \mathbf{S} such that \mathbf{B= S^{-1}A S}
- A matrix \mathbf{A} is diagonalizable iff it is similar to a diagonal matrix
An n \times n matrix \mathbf{A} corresponding to a linear transformation T:\mathbb{R}^n \to \mathbb{R}^n is diagonalizable iff the matrix \mathbf{B} of T with respect to some basis is diagonal.
A matrix is diagonalizable iff it has a basis consisting of eigenvectors
- From the last section we see that an n \times n matrix with n distinct eigenvalues is diagonalizable.
- We also see from the last section that an n \times n matrix is diagonalizable iff the geometric multiplicities of the eigenvalues add up to n
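Diagonalization can be carried out numerically: assemble \mathbf{S} from the eigenvectors and check that \mathbf{S^{-1}AS} is diagonal. A sketch assuming NumPy (not part of the text):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # two distinct eigenvalues, hence diagonalizable

vals, S = np.linalg.eig(A)      # columns of S are eigenvectors, so S is non-singular
D = np.linalg.inv(S) @ A @ S    # similarity transform S^{-1} A S

# D is diagonal, with the eigenvalues of A on its diagonal.
assert np.allclose(D, np.diag(vals))
```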
Exercise 7.3
- Let T:\mathbb{R}^2 \to \mathbb{R}^2 be counter-clockwise rotation by \frac{\pi}{2}. Is the matrix corresponding to T diagonalizable?
- Let T:\mathbb{R}^2 \to \mathbb{R}^2 be counter-clockwise rotation by \pi. Is the matrix corresponding to T diagonalizable?
We have figured out exactly when a square matrix is diagonalizable. A related question is: when is a square matrix diagonalizable with an orthonormal eigenbasis? The answer is given by the following theorem.
A (real square) matrix is diagonalizable with an orthonormal eigenbasis iff it is symmetric
A symmetric n \times n matrix has n real eigenvalues (counting their algebraic multiplicities)
To orthonormally diagonalize a symmetric matrix, we find the eigenvalues of the matrix and a basis for each eigenspace, and then (using the Gram-Schmidt process) find an orthonormal basis of each eigenspace. Since eigenvectors of a symmetric matrix belonging to distinct eigenvalues are automatically orthogonal, assembling the orthonormal bases found above yields an orthonormal eigenbasis.
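In numerical practice, NumPy's `eigh` performs this orthonormal diagonalization for symmetric matrices directly. The sketch below (assuming NumPy, not part of the text) checks that the returned eigenbasis \mathbf{Q} is orthonormal and that \mathbf{Q^{T}AQ} is diagonal.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])  # a symmetric matrix

# eigh returns real eigenvalues and an orthonormal eigenbasis (as columns of Q).
vals, Q = np.linalg.eigh(A)

assert np.allclose(Q.T @ Q, np.eye(3))          # columns of Q are orthonormal
# For orthogonal Q we have Q^{-1} = Q^T, so Q^T A Q = Q^{-1} A Q is diagonal:
assert np.allclose(Q.T @ A @ Q, np.diag(vals))
```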
Exercise 7.4 For the matrix \mathbf{A} below, find an orthogonal matrix \mathbf{S} such that \mathbf{S^{-1}AS} is diagonal A=\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1& 1 & 1 \end{bmatrix}