By the end of this section, you should be able to give precise and thorough answers to the questions listed below. You may want to keep these questions in mind to focus your thoughts as you complete the section.
What is an eigenspace of a matrix?
How do we find a basis for an eigenspace of a matrix?
What is true about any set of eigenvectors for a matrix that correspond to different eigenvalues?
The study of population dynamics, how and why people move from one place to another, is important to economists. The movement of people corresponds to the movement of money, and money makes the economy go. As an example, we might consider a simple model of population migration to and from the state of Michigan.
According to the Michigan Department of Technology, Management, and Budget, from 2011 to 2012, approximately 0.05% of the U.S. population outside of Michigan moved to the state of Michigan, while approximately 2% of Michigan's population moved out of Michigan. A reasonable question to ask about this situation is: if these numbers don't change, what is the long-term distribution of the U.S. population inside and outside of Michigan (under the assumption that the total U.S. population doesn't change)? The answer to this question involves eigenvalues and eigenvectors of a matrix. More details can be found later in this section.
Consider the matrix transformation $T$ from $\mathbb{R}^2$ to $\mathbb{R}^2$ defined by $T(\mathbf{x}) = A\mathbf{x}$ for a $2 \times 2$ matrix $A$.
We are interested in understanding what this matrix transformation does to vectors in $\mathbb{R}^2$. The matrix $A$ has eigenvalues $\lambda_1$ and $\lambda_2$ with corresponding eigenvectors $\mathbf{v}_1$ and $\mathbf{v}_2$.
Equation (14.1) illustrates that it would be convenient to view the action of $T$ in the coordinate system where $\text{Span}\{\mathbf{v}_1\}$ serves as the $x$-axis and $\text{Span}\{\mathbf{v}_2\}$ as the $y$-axis. In this case, we can visualize that when we apply the transformation $T$ to a vector in $\mathbb{R}^2$, the output vector is scaled by a factor of $\lambda_1$ in the $\mathbf{v}_1$ direction and by a factor of $\lambda_2$ in the $\mathbf{v}_2$ direction. For example, consider the box shown at left in Figure 14.1. The transformation $T$ stretches this box by a factor of $\lambda_1$ in the $\mathbf{v}_1$ direction and a factor of $\lambda_2$ in the $\mathbf{v}_2$ direction, as illustrated at right in Figure 14.1. In this situation, the eigenvalues and eigenvectors provide the most convenient perspective through which to visualize the action of the transformation $T$. Here, $\text{Span}\{\mathbf{v}_1\}$ and $\text{Span}\{\mathbf{v}_2\}$ are the eigenspaces of the matrix $A$.
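This behavior is a direct consequence of linearity. If $\mathbf{x} = c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2$, then
\[
T(\mathbf{x}) = A(c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2) = c_1 A\mathbf{v}_1 + c_2 A\mathbf{v}_2 = c_1 \lambda_1 \mathbf{v}_1 + c_2 \lambda_2 \mathbf{v}_2,
\]
so in the eigenvector coordinate system the coordinates of $\mathbf{x}$ are each simply scaled by the corresponding eigenvalue.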
This geometric perspective illustrates how the span of the eigenvectors for each eigenvalue of $A$ tells us something important about $A$. In this section we explore the idea of eigenvalues and the spaces defined by their eigenvectors in more detail.
In other words, the eigenvectors of $A$ with eigenvalue $\lambda$ are the non-zero vectors in $\text{Nul}(A - \lambda I_n)$. Recall that the null space of an $n \times n$ matrix is a subspace of $\mathbb{R}^n$. In Preview Activity 14.1 we saw how these subspaces provided a convenient coordinate system through which to view a matrix transformation. These special null spaces are called eigenspaces.
If we know an eigenvalue $\lambda$ of an $n \times n$ matrix $A$, Activity 14.2 shows us how to find a basis for the corresponding eigenspace: just row reduce $A - \lambda I_n$ to find a basis for $\text{Nul}(A - \lambda I_n)$. To this point we have always been given the eigenvalues of our matrices, and we have not seen how to find these eigenvalues. That process will come a bit later. For now, we just want to become more familiar with eigenvalues and eigenvectors. The next activity should help connect eigenvalues to ideas we have discussed earlier.
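As a concrete illustration of this recipe, here is a short SymPy sketch; the matrix and eigenvalue below are hypothetical choices, not taken from the activities.

from sympy import Matrix, eye

# Hypothetical 3x3 example; substitute any matrix and a known eigenvalue.
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])
lam = 2

# nullspace() row reduces A - lam*I and reads off a basis for Nul(A - lam*I),
# which is exactly a basis for the eigenspace of A corresponding to lam.
basis = (A - lam * eye(3)).nullspace()
print(basis)  # [Matrix([[1], [0], [0]])]: this eigenspace is a line in R^3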
An important question we will want to answer about a matrix is how many linearly independent eigenvectors the matrix has. Activity 14.2 shows that eigenvectors for the same eigenvalue may be linearly dependent or independent, but all of our examples so far seem to indicate that eigenvectors corresponding to different eigenvalues are linearly independent. This turns out to be universally true, as our next theorem demonstrates. The next activity should help prepare us for the proof of this theorem.
Let $\lambda_1$ and $\lambda_2$ be distinct eigenvalues of a matrix $A$ with corresponding eigenvectors $\mathbf{v}_1$ and $\mathbf{v}_2$. The goal of this activity is to demonstrate that $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent. To prove that $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent, suppose that
\[
c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 = \mathbf{0}
\]
for some scalars $c_1$ and $c_2$.
Let $\lambda_1, \lambda_2, \ldots, \lambda_m$ be distinct eigenvalues for a matrix $A$, and for each $i$ between 1 and $m$ let $\mathbf{v}_i$ be an eigenvector of $A$ with eigenvalue $\lambda_i$. Then the vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$ are linearly independent.
Let $A$ be a matrix with distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_m$ and corresponding eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$. To understand why $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$ are linearly independent, we will argue by contradiction and suppose that the vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$ are linearly dependent. Note that $\mathbf{v}_1$ cannot be the zero vector (why?), so the set $\{\mathbf{v}_1\}$ is linearly independent. If we include $\mathbf{v}_2$ into this set, the set $\{\mathbf{v}_1, \mathbf{v}_2\}$ may be linearly independent or dependent. If $\{\mathbf{v}_1, \mathbf{v}_2\}$ is linearly independent, then the set $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ may be linearly independent or dependent. We can continue adding additional vectors until we reach the set $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m\}$, which we are assuming is linearly dependent. So there must be a smallest integer $k$ such that the set $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\}$ is linearly dependent while $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_{k-1}\}$ is linearly independent. Since $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\}$ is linearly dependent, there is a linear combination of $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k$ with weights not all 0 that is the zero vector. Let $c_1, c_2, \ldots, c_k$ be such weights, not all zero, so that
\[
c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_k \mathbf{v}_k = \mathbf{0}. \tag{14.6}
\]
If we multiply both sides of (14.6) on the left by the matrix $A$ we obtain
\[
c_1 \lambda_1 \mathbf{v}_1 + c_2 \lambda_2 \mathbf{v}_2 + \cdots + c_k \lambda_k \mathbf{v}_k = \mathbf{0}. \tag{14.7}
\]
If we multiply both sides of (14.6) by $\lambda_k$ we obtain the equation
\[
c_1 \lambda_k \mathbf{v}_1 + c_2 \lambda_k \mathbf{v}_2 + \cdots + c_k \lambda_k \mathbf{v}_k = \mathbf{0}. \tag{14.8}
\]
Subtracting corresponding sides of equation (14.8) from (14.7) gives us
\[
c_1 (\lambda_1 - \lambda_k) \mathbf{v}_1 + c_2 (\lambda_2 - \lambda_k) \mathbf{v}_2 + \cdots + c_{k-1} (\lambda_{k-1} - \lambda_k) \mathbf{v}_{k-1} = \mathbf{0}. \tag{14.9}
\]
Recall that $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_{k-1}\}$ is a linearly independent set, so the only way a linear combination of vectors in this set can be $\mathbf{0}$ is if all of the weights are 0. Therefore, we must have
\[
c_1 (\lambda_1 - \lambda_k) = 0, \quad c_2 (\lambda_2 - \lambda_k) = 0, \quad \ldots, \quad c_{k-1} (\lambda_{k-1} - \lambda_k) = 0.
\]
Since the eigenvalues are all distinct, $\lambda_i - \lambda_k \neq 0$ for each $i < k$, so this can only happen if
\[
c_1 = c_2 = \cdots = c_{k-1} = 0.
\]
But equation (14.6) then implies that $c_k \mathbf{v}_k = \mathbf{0}$, and since $\mathbf{v}_k \neq \mathbf{0}$ we must have $c_k = 0$, so all of the weights $c_1, c_2, \ldots, c_k$ are 0. However, when we assumed that the eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$ were linearly dependent, this led to having at least one of the weights $c_1, c_2, \ldots, c_k$ be nonzero. This cannot happen, so our assumption that the eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$ were linearly dependent must be false, and we conclude that the eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m$ are linearly independent.
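We can sanity-check the theorem numerically. The matrix below is a hypothetical example chosen only because its three eigenvalues are distinct.

import numpy as np

# Hypothetical example with three distinct eigenvalues (2, 4, and 5).
A = np.array([[3., 1., 0.],
              [1., 3., 0.],
              [0., 0., 5.]])

eigvals, eigvecs = np.linalg.eig(A)   # eigenvectors are the columns of eigvecs
print(np.round(np.sort(eigvals), 6))  # [2. 4. 5.]: all distinct

# The columns are linearly independent exactly when the column matrix has
# full rank, which is what the theorem guarantees here.
print(np.linalg.matrix_rank(eigvecs)) # 3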
Show that $\lambda = 4$ is an eigenvalue of the matrix $A$ and find a basis for the corresponding eigenspace of $A$.
Solution.
Recall that $\lambda$ is an eigenvalue of $A$ if $A - \lambda I_3$ is not invertible. To show that $\lambda = 4$ is an eigenvalue of $A$, we row reduce the matrix $A - 4I_3$ to its reduced row echelon form. Since the third column of this reduced matrix is not a pivot column, the matrix $A - 4I_3$ is not invertible. We conclude that $\lambda = 4$ is an eigenvalue of $A$. The eigenspace of $A$ for the eigenvalue 4 is $\text{Nul}(A - 4I_3)$. The reduced row echelon form of $A - 4I_3$ shows that if $(A - 4I_3)\mathbf{x} = \mathbf{0}$ with $\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}$, then $x_3$ is free and $x_1$ and $x_2$ are each a multiple of $x_3$. Thus, every vector in the eigenspace has the form
\[
\mathbf{x} = x_3 \mathbf{v}
\]
for a single fixed vector $\mathbf{v}$. Therefore, $\{\mathbf{v}\}$ is a basis for the eigenspace of $A$ corresponding to the eigenvalue $\lambda = 4$.
Geometrically describe the eigenspace of $A$ corresponding to the eigenvalue $\lambda = 4$. Explain what the matrix transformation $T$ defined by $T(\mathbf{x}) = A\mathbf{x}$ does to this eigenspace.
Solution.
Since the eigenspace of $A$ corresponding to the eigenvalue $\lambda = 4$ is the span of a single nonzero vector $\mathbf{v}$, this eigenspace is the line in $\mathbb{R}^3$ through the origin in the direction of $\mathbf{v}$. Any vector in this eigenspace has the form $c\mathbf{v}$ for some scalar $c$. Notice that
\[
A(c\mathbf{v}) = c(A\mathbf{v}) = c(4\mathbf{v}) = 4(c\mathbf{v}),
\]
so $T$ expands any vector in this eigenspace by a factor of 4.
Show that $\mu$ is also an eigenvalue of $A$ and find a basis for the corresponding eigenspace of $A$.
Solution.
To show that $\mu$ is an eigenvalue of $A$, we row reduce the matrix $A - \mu I_3$ to its reduced row echelon form. Since the third column of this reduced matrix is not a pivot column, the matrix $A - \mu I_3$ is not invertible. We conclude that $\mu$ is an eigenvalue of $A$. The eigenspace of $A$ for the eigenvalue $\mu$ is $\text{Nul}(A - \mu I_3)$. The reduced row echelon form of $A - \mu I_3$ shows that if $(A - \mu I_3)\mathbf{x} = \mathbf{0}$ with $\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}$, then $x_2$ and $x_3$ are free and $x_1$ is determined by them. Thus, every vector in the eigenspace has the form
\[
\mathbf{x} = x_2 \mathbf{w}_1 + x_3 \mathbf{w}_2
\]
for fixed vectors $\mathbf{w}_1$ and $\mathbf{w}_2$. Therefore, $\{\mathbf{w}_1, \mathbf{w}_2\}$ is a basis for the eigenspace of $A$ corresponding to the eigenvalue $\mu$.
Geometrically describe the eigenspace of $A$ corresponding to the eigenvalue $\mu$. Explain what the transformation $T$ does to this eigenspace.
Solution.
Since the eigenspace of $A$ corresponding to the eigenvalue $\mu$ is the span of two linearly independent vectors $\mathbf{w}_1$ and $\mathbf{w}_2$, this eigenspace is the plane in $\mathbb{R}^3$ through the origin containing $\mathbf{w}_1$ and $\mathbf{w}_2$. Any vector in this eigenspace has the form $a\mathbf{w}_1 + b\mathbf{w}_2$ for some scalars $a$ and $b$. Notice that
\[
A(a\mathbf{w}_1 + b\mathbf{w}_2) = a(A\mathbf{w}_1) + b(A\mathbf{w}_2) = \mu(a\mathbf{w}_1 + b\mathbf{w}_2),
\]
so $T$ scales every vector in this eigenspace by the factor $\mu$.
Suppose that $\lambda^m$ is an eigenvalue of $A^m$ with eigenvector $\mathbf{v}$ for some positive integer $m$. Show then that $\lambda^{m+1}$ is an eigenvalue of $A^{m+1}$ with eigenvector $\mathbf{v}$. This argument shows that $\lambda^k$ is an eigenvalue of $A^k$ with eigenvector $\mathbf{v}$ for any positive integer $k$.
Solution.
Let $A$ be an $n \times n$ matrix with eigenvalue $\lambda$ and corresponding eigenvector $\mathbf{v}$. Suppose that $\lambda^m$ is an eigenvalue of $A^m$ with eigenvector $\mathbf{v}$, so $A^m \mathbf{v} = \lambda^m \mathbf{v}$. Then
\[
A^{m+1}\mathbf{v} = A\left(A^m \mathbf{v}\right) = A\left(\lambda^m \mathbf{v}\right) = \lambda^m (A\mathbf{v}) = \lambda^m (\lambda \mathbf{v}) = \lambda^{m+1}\mathbf{v},
\]
so $\lambda^{m+1}$ is an eigenvalue of $A^{m+1}$ with eigenvector $\mathbf{v}$. Since $\lambda^1$ is an eigenvalue of $A^1$ with eigenvector $\mathbf{v}$, it follows by induction that $\lambda^k$ is an eigenvalue of $A^k$ with eigenvector $\mathbf{v}$ for every positive integer $k$.
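A quick SymPy check of this fact; the matrix and eigenpair below are illustrative choices, not taken from the activity.

from sympy import Matrix

# For this hypothetical A, v = [1, 1]^T is an eigenvector with eigenvalue 4.
A = Matrix([[3, 1],
            [1, 3]])
v = Matrix([1, 1])
lam = 4

print(A * v == lam * v)          # True: A v = 4 v
print(A**3 * v == lam**3 * v)    # True: A^3 v = 4^3 v, matching the induction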
Now we investigate a special type of matrix. Let $B$ be the given matrix. Show that $B^2 = 0$. (A square matrix $M$ is nilpotent if $M^m = 0$ for some positive integer $m$, so $B$ is an example of a nilpotent matrix.) What are the eigenvalues of $B$? Explain.
Solution.
Straightforward calculations show that $B^2 = 0$. Since $B$ is an upper triangular matrix, the eigenvalues of $B$ are the entries on its diagonal. That is, the only eigenvalue of $B$ is $0$.
Show that the only eigenvalue of a nilpotent matrix is $0$.
Solution.
Assume that $B$ is a nilpotent matrix. Suppose that $\lambda$ is an eigenvalue of $B$ with corresponding eigenvector $\mathbf{v}$. Since $B$ is a nilpotent matrix, there is a positive integer $m$ such that $B^m = 0$. But $\lambda^m$ is an eigenvalue of $B^m$ with eigenvector $\mathbf{v}$. The only eigenvalue of the zero matrix is $0$, so $\lambda^m = 0$. This implies that $\lambda = 0$. We conclude that the only eigenvalue of a nilpotent matrix is $0$.
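A short numerical illustration; the matrix below is a hypothetical nilpotent example (any strictly upper triangular matrix will do).

import numpy as np

# Strictly upper triangular, so B is nilpotent: here B @ B is already zero.
B = np.array([[0., 1.],
              [0., 0.]])

print(B @ B)                  # [[0. 0.] [0. 0.]]: B^2 = 0
print(np.linalg.eigvals(B))   # [0. 0.]: the only eigenvalue is 0, as argued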
To provide an alternative explanation of the result in the previous part, let $\mathbf{v}$ be an eigenvector of $B$ corresponding to the eigenvalue $\lambda$. Consider the matrix transformation $T$ corresponding to $B$ and the matrix transformation $S$ corresponding to $B^2$. By considering what happens to $\mathbf{v}$ if $T$ and then $T$ are applied, describe why this justifies that $\mathbf{v}$ is also an eigenvector of $B^2$.
As introduced earlier, data from the Michigan Department of Technology, Management, and Budget shows that from 2011 to 2012, approximately 0.05% of the U.S. population outside of Michigan moved to the state of Michigan, while approximately 2% of Michigan's population moved out of Michigan. We are interested in determining the long-term distribution of population in Michigan.
Let $\mathbf{x}_t = \begin{bmatrix} m_t \\ u_t \end{bmatrix}$ be the vector where $m_t$ is the population of Michigan and $u_t$ is the U.S. population outside of Michigan in year $t$. Assume that we start our analysis at generation 0 with initial population vector $\mathbf{x}_0$.
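From the stated migration rates, each year 98% of Michigan's population stays and 2% leaves, while 99.95% of the outside population stays out and 0.05% moves in. A sketch of the resulting update rule, assuming the ordering of the entries of $\mathbf{x}_t$ given above (the transition matrix itself is developed in Project Activity 14.5):
\[
\mathbf{x}_{t+1} = N \mathbf{x}_t, \qquad N = \begin{bmatrix} 0.98 & 0.0005 \\ 0.02 & 0.9995 \end{bmatrix}.
\]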
This example illustrates the general nature of what is called a Markov process (see Definition 9.4). Recall that the matrix that provides the link from one generation to the next is called the transition matrix.
Consider again the transition matrix $N$ from Project Activity 14.5. Recall that the solutions to equation (14.10) are all the vectors in $\text{Nul}(N - I_2)$. In other words, the eigenvectors of $N$ for the eigenvalue $\lambda = 1$ are the nonzero vectors in $\text{Nul}(N - I_2)$. Find a basis for the eigenspace of $N$ corresponding to this eigenvalue. Use whatever technology is appropriate.
Once we know a basis for the eigenspace of the transition matrix $N$, we can use it to estimate the steady-state population of Michigan (assuming the stated migration trends are valid long-term). According to the US Census Bureau [29], the resident U.S. population on December 1, 2019 was 330,073,471. Assuming no population growth in the U.S., what would the long-term population of Michigan be? How realistic do you think this is?
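A sketch of the full computation in NumPy; the layout of $N$ follows the sketch above (an assumption about Project Activity 14.5), and the census figure is the one quoted in the text.

import numpy as np

# Transition matrix from the stated rates: 2% of Michigan leaves each year,
# 0.05% of the rest of the U.S. moves in. First entry = Michigan population.
N = np.array([[0.98, 0.0005],
              [0.02, 0.9995]])

# The steady state is an eigenvector for the eigenvalue 1, i.e. a nonzero
# vector in Nul(N - I). Pick it out of numpy's eigendecomposition.
eigvals, eigvecs = np.linalg.eig(N)
steady = eigvecs[:, np.argmin(np.abs(eigvals - 1.0))]
steady = steady / steady.sum()    # rescale so the entries total 1

total = 330_073_471               # resident U.S. population, Dec 1, 2019
print(f"Long-term Michigan population: {steady[0] * total:,.0f}")
# about 8.05 million (total/41), versus roughly 10 million in 2019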