Section 4.4 Eigenvalues and Eigenvectors
Definition 4.30.
An eigenvector of a matrix \(A\) is a nonzero vector \(\vec{x}\) such that \(A\vec{x}=\lambda \vec{x}\) for some scalar \(\lambda\text{.}\) The scalar \(\lambda\) is called an eigenvalue of \(A\) if the equation \(A\vec{x}=\lambda \vec{x}\) has a nonzero solution \(\vec{x}\text{.}\)
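For example, if \(A=\begin{bmatrix} 2\amp 0\\0\amp 3 \end{bmatrix}\text{,}\) then \(A\colvec{1\\ 0}=\colvec{2\\ 0}=2\colvec{1\\ 0}\text{,}\) so \(\colvec{1\\ 0}\) is an eigenvector of \(A\) with eigenvalue \(2\text{.}\)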
Question 4.35.
Which of the following vectors are an eigenvector of \(A=\begin{bmatrix} 2\amp 3\\3\amp 2
\end{bmatrix}\text{?}\) For any vectors that are eigenvectors of \(A\text{,}\) give the eigenvalue. To speed things along, we are going to use SageMath cells embedded in the course notes. The code below sets up the computation that you need to do to answer the first part below. You may modify the code and click the button again (or type Shift-Return) to solve the other parts. If you mess up the code, just reload the page.
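The embedded SageMath cell is not reproduced in this text; the following plain-Python sketch (our own helper names, not the course's Sage code) performs the same check:

```python
# Plain-Python stand-in for the embedded SageMath cell (hypothetical
# helper names; the Sage version would use vector/matrix types instead).
from fractions import Fraction

def mat_vec(A, v):
    """Multiply matrix A (list of rows) by vector v (list of numbers)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def eigenvalue_of(A, v):
    """Return the eigenvalue if v is an eigenvector of A, else None."""
    if all(x == 0 for x in v):
        return None  # the zero vector is never an eigenvector
    Av = mat_vec(A, v)
    i = next(j for j, x in enumerate(v) if x != 0)
    lam = Fraction(Av[i], v[i])  # candidate eigenvalue from one entry
    return lam if Av == [lam * x for x in v] else None

A = [[2, 3], [3, 2]]
print(eigenvalue_of(A, [1, 0]))  # None: now try the vectors from parts (a)-(e)
```

Change the vector in the last line to test each part of the question.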
(a)
\(\vec{v_1}=\colvec{1\\ 2}\)
(b)
\(\vec{v_2}=\colvec{-1\\ 1}\)
(c)
\(\vec{v_3}=\colvec{3\\ -1}\)
(d)
\(\vec{v_4}=\colvec{1\\ 1}\)
(e)
\(\vec{v_5}=\colvec{0\\ 0}\)
As a hint for the following two problems, it will suffice to try to find an eigenvector of the form \(\colvec{1\\ a}\text{.}\) You might first convince yourself that for these matrices, no eigenvector can have first component \(0\text{.}\)
Question 4.36.
Let \(A=\begin{bmatrix} 2\amp 1\\-1\amp 3 \end{bmatrix}\text{.}\) Try to find an eigenvector with eigenvalue \(3\text{.}\) In other words, find a vector \(\vec{v}\) such that \(A\vec{v}=3\vec{v}\text{.}\)
Question 4.37.
Let \(A=\begin{bmatrix} 3\amp 4\\3\amp -1 \end{bmatrix}\text{.}\) Try to find an eigenvector with eigenvalue \(-3\text{.}\) In other words, find a vector \(\vec{v}\) such that \(A\vec{v}=-3\vec{v}\text{.}\)
Theorem 4.31.
Let \(A\) be a square matrix. We have that \(\det(A- \alpha Id)=0\) iff \(\alpha\) is an eigenvalue of \(A\text{.}\)
If \(A\) is an \(n\) by \(n\) matrix, then \(\det(A- t Id)\) will be an \(n\)-th degree polynomial in \(t\text{,}\) which we call the characteristic polynomial of \(A\text{.}\) The previous theorem shows that finding the roots of the characteristic polynomial is the same as finding the eigenvalues of \(A\text{.}\)
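For example (using a matrix not taken from the exercises), if \(A=\begin{bmatrix} 4\amp 1\\2\amp 3 \end{bmatrix}\text{,}\) then
\begin{equation*}
\det(A- t Id)=\det\begin{bmatrix} 4-t\amp 1\\2\amp 3-t \end{bmatrix}=(4-t)(3-t)-2=t^2-7t+10=(t-2)(t-5)\text{,}
\end{equation*}
so the eigenvalues of \(A\) are \(2\) and \(5\text{.}\)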
Question 4.38.
For each of the following matrices: write out the characteristic polynomial, give all eigenvalues, and for each eigenvalue, find an eigenvector. You should do the first two by hand to get a feel for finding the characteristic polynomial. After that, I have provided a SageMath cell you can modify to get the characteristic polynomial quickly, but you will need to work from there to find eigenvalues and eigenvectors.
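The SageMath cell is not reproduced in this text; the plain-Python sketch below (our own helper names; Sage itself provides `A.charpoly()`) computes the characteristic polynomial by cofactor expansion, representing a polynomial in \(t\) as its list of coefficients, lowest degree first:

```python
# Plain-Python stand-in for the provided SageMath cell (hypothetical
# helper names). A polynomial c0 + c1*t + ... is the list [c0, c1, ...].

def poly_add(p, q):
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
            for i in range(n)]

def poly_mul(p, q):
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_neg(p):
    return [-a for a in p]

def det_poly(M):
    """Determinant of a matrix of polynomials, by cofactor expansion."""
    if len(M) == 1:
        return M[0][0]
    total = [0]
    for j in range(len(M)):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        term = poly_mul(M[0][j], det_poly(minor))
        total = poly_add(total, term if j % 2 == 0 else poly_neg(term))
    return total

def char_poly(A):
    """Coefficients of det(A - t*Id), lowest degree first."""
    n = len(A)
    M = [[([A[i][j]] if i != j else [A[i][j], -1]) for j in range(n)]
         for i in range(n)]
    return det_poly(M)

print(char_poly([[2, 3], [3, 2]]))  # [-5, -4, 1], i.e. t^2 - 4t - 5
```

Swap in each matrix below after doing the first two by hand.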
(a)
\(\begin{bmatrix} 1\amp 1 \\1\amp 1 \end{bmatrix}\)
(b)
\(\begin{bmatrix} 1\amp -3 \\-3\amp 1 \end{bmatrix}\)
(c)
\(\begin{bmatrix} 1\amp 2 \\3\amp 4
\end{bmatrix}\)
(d)
\(\begin{bmatrix} 1\amp 2\amp 3 \\4\amp 5\amp 6\\7\amp 8\amp 9 \end{bmatrix}\)
(e)
\(\begin{bmatrix} 4\amp -1\amp 6\\2\amp 1\amp 6\\2\amp -1\amp 8 \end{bmatrix}\)
(f)
\(\begin{bmatrix} 1\amp 1\amp 0\amp 0\\1\amp
1\amp 0\amp 0\\0\amp 0\amp 1\amp -3\\0\amp 0\amp -3\amp 1
\end{bmatrix}\)
Hint.
Work smarter, not harder, on this part!
A root \(\alpha\) of a polynomial (in \(t\)) has (algebraic) multiplicity \(k\) if \(k\) is the largest integer such that \((t-\alpha)^k\) is a factor. Which, if any, of the eigenvalues you found above have algebraic multiplicity greater than \(1\text{?}\)
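As a quick example of higher multiplicity (not one of the matrices above): the characteristic polynomial of \(\begin{bmatrix} 3\amp 1\\0\amp 3 \end{bmatrix}\) is \((3-t)^2\text{,}\) so the eigenvalue \(3\) has algebraic multiplicity \(2\text{.}\)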
Question 4.39.
Prove that a nonzero vector, \(\vec{v}\text{,}\) is an eigenvector of \(A\) with eigenvalue \(\lambda\) if and only if \(\vec{v}\) is in the null space of \(A-\lambda Id\text{.}\)
Solution.
\((\Rightarrow)\) If \(\vec{v}\) is an eigenvector of \(A\) with eigenvalue \(\lambda\text{,}\) then \(A\vec{v} =
\lambda\vec{v}\text{.}\) By algebra, this means that \(A\vec{v} -
\lambda\vec{v} = \vec{0}\text{,}\) and hence \((A-\lambda
Id)\vec{v}=\vec{0}\text{.}\) Thus, \(\vec{v}\) is in the null space of \(A-\lambda Id\text{.}\)
\((\Leftarrow)\) If \(\vec{v}\in Null(A-\lambda Id)\text{,}\) then \((A-\lambda Id)\vec{v} = \vec{0}\text{.}\) Hence, \(A\vec{v}-\lambda \vec{v} = \vec{0}\text{,}\) or \(A\vec{v} = \lambda\vec{v}\text{.}\) Thus, \(\vec{v}\) is an eigenvector of \(A\) with eigenvalue \(\lambda\text{.}\)
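This equivalence is easy to spot-check numerically. Below is a plain-Python sketch (our own helper names, not from the course's Sage cells) using a matrix that is not one of the exercises:

```python
# Check that (A - lam*Id) v = 0 exactly when A v = lam v.
def mat_vec(A, v):
    """Multiply matrix A (list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def minus_lambda_id(A, lam):
    """Return the matrix A - lam * Id."""
    n = len(A)
    return [[A[i][j] - (lam if i == j else 0) for j in range(n)]
            for i in range(n)]

A, lam, v = [[4, 1], [2, 3]], 5, [1, 1]
print(mat_vec(minus_lambda_id(A, lam), v))  # [0, 0]: v is in Null(A - 5 Id)
print(mat_vec(A, v))                        # [5, 5] = 5*v: v is an eigenvector
```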
Question 4.40.
Prove that if \(\vec{v}\) is an eigenvector of \(A\text{,}\) then \(\alpha \vec{v}\) is also an eigenvector of \(A\) (when \(\alpha \neq 0\)).
Solution.
Since \(\vec{v}\) is an eigenvector of \(A\text{,}\) there is a scalar \(\lambda\) such that \(A\vec{v} = \lambda
\vec{v}\text{.}\) By properties of matrix multiplication, we thus have
\begin{equation*}
A(\alpha\vec{v}) = \alpha A\vec{v} = \alpha\lambda\vec{v} =
\lambda(\alpha\vec{v})\text{.}
\end{equation*}
As \(\alpha\neq 0\text{,}\) this shows that \(\alpha\vec{v}\) is an eigenvector with the same eigenvalue.
Question 4.41.
Prove that if \(\vec{v_1}\) and \(\vec{v_2}\) are eigenvectors of \(A\) with the same eigenvalue, then \(\vec{v_1}+\vec{v_2}\) is also an eigenvector of \(A\text{.}\) What is the eigenvalue of \(\vec{v_1}+\vec{v_2}\text{?}\)
Solution.
Let \(\lambda\) be the associated eigenvalue. We have that \(A\vec{v}_1 = \lambda\vec{v}_1\) and \(A\vec{v}_2 =
\lambda\vec{v}_2\text{.}\) Thus, we have
\begin{equation*}
A(\vec{v}_1 + \vec{v}_2) =
A\vec{v}_1 + A\vec{v}_2 = \lambda\vec{v}_1 + \lambda\vec{v}_2 =
\lambda(\vec{v}_1 + \vec{v}_2)\text{.}
\end{equation*}
Therefore, \(\vec{v}_1 +
\vec{v}_2\) is an eigenvector with the same eigenvalue.
Definition 4.32.
If \(\lambda\) is an eigenvalue of \(A\text{,}\) then the eigenspace of \(\lambda\), \(E_\lambda\text{,}\) is the set of vectors \(\vec{x}\) such that \((A-\lambda
Id_n)\vec{x}=\vec{0}\text{.}\)
The previous two questions, together with the fact that \(\vec{0}\in E_\lambda\text{,}\) prove the following theorem.
Theorem 4.33.
If \(\lambda\) is an eigenvalue of \(A \in M_{n
\times n}\text{,}\) then \(E_\lambda\) is a subspace of \(\mathbb{R}^n\text{.}\)
Question 4.42.
Prove that \(\dim(E_\lambda) \geq 1\) for every eigenvalue \(\lambda\text{.}\)
Solution.
We prove this by contradiction. Suppose that some eigenvalue \(\lambda\) has an eigenspace of dimension \(0\text{.}\) Then \(E_\lambda = \{\vec{0}\}\text{,}\) since this is the only subspace of dimension \(0\text{.}\) However, since \(\lambda\) is an eigenvalue of \(A\text{,}\) there is a nonzero vector \(\vec{v}\) with \(A\vec{v}=\lambda\vec{v}\text{,}\) and by Question 4.39 this \(\vec{v}\) lies in \(E_\lambda\text{.}\) This contradicts \(E_\lambda = \{\vec{0}\}\text{.}\)
Question 4.43.
(a)
Let \(A =\begin{bmatrix} 2 \amp a\amp
b\\0\amp 2\amp c\\0\amp 0\amp 2 \end{bmatrix}\text{.}\) Show that \(A\) only has an eigenvalue of 2. What is the algebraic multiplicity of the eigenvalue 2?
(b)
Can you pick \(a\text{,}\) \(b\text{,}\) and \(c\text{,}\) so that the eigenspace of 2 has dimension 3? If so, give a choice of \(a\text{,}\) \(b\text{,}\) and \(c\) that does so.
(c)
Can you pick \(a\text{,}\) \(b\text{,}\) and \(c\text{,}\) so that the eigenspace of 2 has dimension 2? If so, give a choice of \(a\text{,}\) \(b\text{,}\) and \(c\) that does so.
(d)
Can you pick \(a\text{,}\) \(b\text{,}\) and \(c\text{,}\) so that the eigenspace of 2 has dimension 1? If so, give a choice of \(a\text{,}\) \(b\text{,}\) and \(c\) that does so.
Diagonalizability.
Definition 4.34.
A matrix \(A\) is diagonalizable if there exists an invertible matrix \(Q\) such that \(A=QDQ^{-1}\) where \(D\) is a diagonal matrix.
We will not prove this theorem, but we will make use of it:
Theorem 4.35.
A matrix \(A \in M_{n \times n}\) is diagonalizable iff \(A\) has \(n\) linearly independent eigenvectors. In fact, the matrix \(Q\) that will diagonalize \(A\) will have the \(n\) linearly independent eigenvectors as its columns.
The question becomes: when can we find \(n\) linearly independent eigenvectors for a matrix \(A\text{?}\) If you can, then the matrix \(Q\) has these eigenvectors as its columns, and the diagonal matrix \(D\) has the corresponding eigenvalues on its diagonal. In particular, if the \(i\)-th column of \(Q\) is an eigenvector with eigenvalue \(\lambda_i\text{,}\) then \(D_{i,i} = \lambda_i\text{.}\)
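As a sanity check of this recipe, here is a plain-Python sketch (our own example matrix, not one of the exercises below) verifying \(A=QDQ^{-1}\) for a matrix with eigenvalues \(2\) and \(5\):

```python
# Verify A = Q D Q^(-1) for a small example with exact arithmetic.
from fractions import Fraction

def mat_mul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inverse_2x2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

# A has eigenvalues 2 and 5, with eigenvectors (1, -2) and (1, 1).
A = [[4, 1], [2, 3]]
Q = [[1, 1], [-2, 1]]   # eigenvectors as columns
D = [[2, 0], [0, 5]]    # matching eigenvalues on the diagonal
QDQinv = mat_mul(mat_mul(Q, D), inverse_2x2(Q))
print(QDQinv == A)      # True
```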
Question 4.44.
Can you diagonalize \(A=\begin{bmatrix} -1\amp 2\\-2\amp 4 \end{bmatrix}\text{?}\) If so, give a basis of eigenvectors, give corresponding choices for \(Q\text{,}\) \(Q^{-1}\text{,}\) and \(D\text{,}\) then use these to demonstrate how \(A=QDQ^{-1}\text{.}\)
Question 4.45.
Can you diagonalize \(A=\begin{bmatrix} 1\amp -1\\1\amp 1 \end{bmatrix}\text{?}\) If so, give a basis of eigenvectors, give corresponding choices for \(Q\text{,}\) \(Q^{-1}\text{,}\) and \(D\text{,}\) then use these to demonstrate how \(A=QDQ^{-1}\text{.}\)
Lemma 4.36.
If \(\vec{v_1}\) is an eigenvector with eigenvalue \(\lambda_1\) and \(\vec{v_2}\) is an eigenvector with eigenvalue \(\lambda_2 \neq \lambda_1\text{,}\) then \(\{ \vec{v_1},\vec{v_2} \}\) is linearly independent.
The following theorem relies on the preceding lemma and the fact that the dimension of every eigenspace is at least 1.
Theorem 4.37.
If an \(n\) by \(n\) matrix \(A\) has \(n\) distinct eigenvalues, then \(A\) is diagonalizable.
Question 4.46.
The converse of this theorem is not true: there are diagonalizable matrices that do not have distinct eigenvalues. Give an example of a matrix that is diagonalizable but does not have distinct eigenvalues. Remember that diagonal matrices are diagonalizable.
Theorem 4.38.
An \(n\) by \(n\) matrix \(A\) is diagonalizable if and only if the sum of the dimensions of its eigenspaces is \(n\text{.}\)
Question 4.47.
Give an example of a matrix (with real eigenvalues) that is not diagonalizable. Justify your claim.
Question 4.48.
Let \(A\) be a \(4\) by \(4\) matrix.
(a)
How many eigenvalues can \(A\) have?
(b)
For each of the possible number of eigenvalues in the previous part, write out all of the possible dimensions of each of the eigenspaces. For instance: if \(A\) has 4 distinct eigenvalues, then the only possibility is that each eigenspace has dimension 1 (why is that?).
(c)
Which of the cases from the previous problem correspond to \(A\) being diagonalizable?
Question 4.49.
Let \(A=\begin{bmatrix} 7 \amp -5 \amp 25\\ 10 \amp -8 \amp
35\\ 0 \amp 0 \amp -1\end{bmatrix}\text{.}\) Diagonalize \(A\) and use your diagonalization to compute \(A^{10}\text{.}\)
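The payoff of diagonalizing first is that \(A^{k}=QD^{k}Q^{-1}\text{,}\) and \(D^{k}\) just raises each diagonal entry to the \(k\)-th power. Here is a plain-Python sketch of that shortcut on a small matrix (not the matrix of this question, so as not to give the answer away):

```python
# Compare A^k computed directly with Q D^k Q^(-1).
from fractions import Fraction

def mat_mul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def mat_pow(M, k):
    """Naive repeated multiplication, for checking the shortcut."""
    out = M
    for _ in range(k - 1):
        out = mat_mul(out, M)
    return out

# A = Q D Q^(-1) with eigenvalues 2 and 5 (eigenvectors (1,-2) and (1,1)).
A = [[4, 1], [2, 3]]
Q = [[1, 1], [-2, 1]]
Qinv = [[Fraction(1, 3), Fraction(-1, 3)], [Fraction(2, 3), Fraction(1, 3)]]
k = 10
Dk = [[2**k, 0], [0, 5**k]]              # D^k: just power the diagonal
shortcut = mat_mul(mat_mul(Q, Dk), Qinv)
print(shortcut == mat_pow(A, k))         # True
```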