Discussion. Recall that once we have obtained an eigenvalue $\lambda$ of a square matrix $A$, we solve the below equation for $\mathbf{x}$ to obtain the corresponding eigenvectorslink:

$$(A-\lambda I)\mathbf{x}=\mathbf{0}$$

By definitionlink, the set of vectors $\mathbf{x}$ that satisfies such a system is the null space of $A-\lambda I$. The eigenspace $E_\lambda$ associated with $\lambda$ is defined to be this null space, that is, $E_\lambda=\mathrm{nullspace}(A-\lambda I)$.

Note that an eigenspace always contains the zero vector because $\mathbf{x}=\mathbf{0}$ is a solution to $(A-\lambda I)\mathbf{x}=\mathbf{0}$. However, an eigenvector cannot be the zero vector by definitionlink. In other words, all the vectors that reside in the eigenspace $E_\lambda$ are eigenvectors of $A$ except for the zero vector.
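For a minimal sketch of this idea, take the $2\times2$ identity matrix $I_2$ (a toy example of our own choosing), whose only eigenvalue is $\lambda=1$:

$$E_1=\mathrm{nullspace}(I_2-1\cdot I_2)=\mathrm{nullspace}\begin{pmatrix}0&0\\0&0\end{pmatrix}=\mathbb{R}^2$$

Every nonzero vector of $\mathbb{R}^2$ is therefore an eigenvector of $I_2$, while the zero vector belongs to the eigenspace without being an eigenvector.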
■
Example.
Finding the eigenspace of a matrix
Consider a $2\times2$ matrix $A$ with two distinct eigenvalues (a concrete illustrative instance is worked out at the end of this example).
Find the eigenspaces of $A$.
Solution. The characteristic polynomiallink of $A$ is obtained by setting:

$$\det(A-\lambda I)=0$$

Solving this yields two distinct roots. Therefore, the eigenvalues of $A$ are $\lambda_1$ and $\lambda_2$. Next, let's find eigenvectors corresponding to $\lambda_1$ and $\lambda_2$ respectively.

For $\lambda_1$, we solve $(A-\lambda_1I)\mathbf{x}=\mathbf{0}$. Taking the first row of the reduced system gives a single relation between the components $x_1$ and $x_2$.

Let $x_2=t$ where $t$ is some scalar. The relation then determines $x_1$ in terms of $t$. Therefore, an eigenvector corresponding to eigenvalue $\lambda_1$ can be expressed as:

$$\mathbf{x}=t\mathbf{v}_1$$

for some fixed nonzero vector $\mathbf{v}_1$. The eigenspace associated with $\lambda_1$ is the vector space below:

$$E_{\lambda_1}=\{t\mathbf{v}_1:t\in\mathbb{R}\}=\mathrm{span}(\mathbf{v}_1)$$

We can think of this eigenspace as a collection of eigenvectors corresponding to $\lambda_1$. Again, keep in mind that any vector in this eigenspace is an eigenvector associated with $\lambda_1$ except the zero vector.

Next, let's find the eigenvectors corresponding to the eigenvalue $\lambda_2$. We proceed like so - solve $(A-\lambda_2I)\mathbf{x}=\mathbf{0}$ in the same way.

Let $x_2=s$ where $s\in\mathbb{R}$. This means $\mathbf{x}$ is:

$$\mathbf{x}=s\mathbf{v}_2$$

for some fixed nonzero vector $\mathbf{v}_2$. The eigenspace corresponding to eigenvalue $\lambda_2$ is:

$$E_{\lambda_2}=\{s\mathbf{v}_2:s\in\mathbb{R}\}=\mathrm{span}(\mathbf{v}_2)$$
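Since the steps above are easiest to follow with numbers, here is a worked instance with an illustrative matrix of our own choosing:

$$A=\begin{pmatrix}1&2\\4&3\end{pmatrix},\qquad\det(A-\lambda I)=(1-\lambda)(3-\lambda)-8=\lambda^2-4\lambda-5=(\lambda-5)(\lambda+1)$$

so $\lambda_1=5$ and $\lambda_2=-1$. For $\lambda_1=5$, the first row of $(A-5I)\mathbf{x}=\mathbf{0}$ reads $-4x_1+2x_2=0$, so $x_1=\tfrac{1}{2}x_2$; setting $x_2=t$ gives $\mathbf{x}=t\begin{pmatrix}1/2\\1\end{pmatrix}$ and hence:

$$E_{5}=\mathrm{span}\left\{\begin{pmatrix}1\\2\end{pmatrix}\right\}$$

For $\lambda_2=-1$, the first row of $(A+I)\mathbf{x}=\mathbf{0}$ reads $2x_1+2x_2=0$, so $x_1=-x_2$, and:

$$E_{-1}=\mathrm{span}\left\{\begin{pmatrix}-1\\1\end{pmatrix}\right\}$$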
■
Theorem.
Eigenspace is a subspace
Let $A$ be an $n\times n$ matrix and let $\lambda$ be an eigenvalue of $A$. The eigenspace $E_\lambda$ associated with $\lambda$ is a subspacelink of $\mathbb{R}^n$.
Proof. By definitionlink, the eigenspace of an eigenvalue $\lambda$ is:

$$E_\lambda=\mathrm{nullspace}(A-\lambda I)$$

By theorem, the null space of any $m\times n$ matrix is a subspace of $\mathbb{R}^n$. Since $A-\lambda I$ is an $n\times n$ matrix, its null space $E_\lambda$ is a subspace of $\mathbb{R}^n$. This completes the proof.
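Alternatively, here is a minimal direct verification sketch of the subspace properties, using only the defining equation: if $\mathbf{u},\mathbf{v}\in E_\lambda$ and $c$ is a scalar, then

$$(A-\lambda I)(c\mathbf{u}+\mathbf{v})=c(A-\lambda I)\mathbf{u}+(A-\lambda I)\mathbf{v}=c\mathbf{0}+\mathbf{0}=\mathbf{0}$$

so $c\mathbf{u}+\mathbf{v}\in E_\lambda$. Together with the fact that $\mathbf{0}\in E_\lambda$, this is exactly what a subspace requires.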
■
Definition.
Eigenbasis
Let $A$ be an $n\times n$ matrix. An eigenbasis of $A$ is a basislink of $\mathbb{R}^n$ consisting of eigenvectors of $A$. As we shall see later, not all square matrices have an eigenbasis.
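As a quick sketch of this definition with an assumed toy matrix, take the diagonal matrix:

$$D=\begin{pmatrix}2&0\\0&3\end{pmatrix}$$

The standard basis vectors satisfy $D\mathbf{e}_1=2\mathbf{e}_1$ and $D\mathbf{e}_2=3\mathbf{e}_2$, so $\{\mathbf{e}_1,\mathbf{e}_2\}$ is a basis of $\mathbb{R}^2$ made up of eigenvectors of $D$ - an eigenbasis.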
Example.
Finding the eigenbasis of a matrix
Find an eigenbasis of a $2\times2$ matrix $A$ with two distinct eigenvalues (an illustrative instance is worked out at the end of this example).
Solution. The characteristic polynomial of $A$ is obtained by setting:

$$\det(A-\lambda I)=0$$

Solving this yields two distinct roots. Therefore, the eigenvalues of $A$ are $\lambda_1$ and $\lambda_2$.
Let's find an eigenvector associated with $\lambda_1$ like so:

$$(A-\lambda_1I)\mathbf{x}=\mathbf{0}$$

Here, $x_1$ is a basic variablelink while $x_2$ is a free variablelink. Let $x_2=t$ where $t$ is some scalar. The solution can now be expressed as:

$$\mathbf{x}=t\mathbf{v}_1$$

for some fixed nonzero vector $\mathbf{v}_1$. Therefore, the eigenspace associated with $\lambda_1$ is:

$$E_{\lambda_1}=\mathrm{span}(\mathbf{v}_1)$$

The basis for $E_{\lambda_1}$ is:

$$B_1=\{\mathbf{v}_1\}$$

Next, let's find the eigenvectors corresponding to $\lambda_2$. We proceed like so:

$$(A-\lambda_2I)\mathbf{x}=\mathbf{0}$$

Let $x_2=s$ where $s\in\mathbb{R}$. The solution can be expressed as:

$$\mathbf{x}=s\mathbf{v}_2$$

The eigenspace is:

$$E_{\lambda_2}=\mathrm{span}(\mathbf{v}_2)$$

The basis for $E_{\lambda_2}$ is:

$$B_2=\{\mathbf{v}_2\}$$

The eigenbasis for $A$ is the union of $B_1$ and $B_2$, that is:

$$B_1\cup B_2=\{\mathbf{v}_1,\mathbf{v}_2\}$$

Note that the combined basis vectors are guaranteed to be linearly independent given that they correspond to distinct eigenvalues. We will prove this laterlink in this guide.
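For concreteness, here is the same procedure run on an illustrative matrix of our own choosing:

$$A=\begin{pmatrix}4&1\\2&3\end{pmatrix},\qquad\det(A-\lambda I)=(4-\lambda)(3-\lambda)-2=(\lambda-2)(\lambda-5)$$

For $\lambda_1=2$, the first row of $(A-2I)\mathbf{x}=\mathbf{0}$ reads $2x_1+x_2=0$; with $x_2=t$ we get $\mathbf{x}=t\begin{pmatrix}-1/2\\1\end{pmatrix}$, so we may take $B_1=\left\{\begin{pmatrix}-1\\2\end{pmatrix}\right\}$. For $\lambda_2=5$, the first row of $(A-5I)\mathbf{x}=\mathbf{0}$ reads $-x_1+x_2=0$, giving $B_2=\left\{\begin{pmatrix}1\\1\end{pmatrix}\right\}$. The union $\left\{\begin{pmatrix}-1\\2\end{pmatrix},\begin{pmatrix}1\\1\end{pmatrix}\right\}$ is linearly independent and hence an eigenbasis of $\mathbb{R}^2$.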
■
Example.
Showing that a matrix does not have an eigenbasis
Consider a $2\times2$ matrix $A$ whose only eigenvalue is $\lambda_1$ (a concrete illustrative instance appears at the end of this example).
Show that an eigenbasis does not exist for $A$.
Solution. The characteristic polynomial of $A$ is a perfect square:

$$\det(A-\lambda I)=(\lambda_1-\lambda)^2$$

The eigenvalue of $A$ is thus $\lambda_1$. The corresponding eigenvectors are obtained by solving:

$$(A-\lambda_1I)\mathbf{x}=\mathbf{0}$$

Here, $x_1$ is a free variable and $x_2$ is a basic variable. Let $x_1=t$ where $t$ is some scalar. The solution set can now be expressed as:

$$\mathbf{x}=t\mathbf{v}_1$$

for some fixed nonzero vector $\mathbf{v}_1$. Therefore, the eigenspace associated with $\lambda_1$ is:

$$E_{\lambda_1}=\mathrm{span}(\mathbf{v}_1)$$

The basis for $E_{\lambda_1}$ is:

$$\{\mathbf{v}_1\}$$

Since the eigenspace is spanned by one basis vector, the dimensionlink of $E_{\lambda_1}$ is $1$. By definitionlink, the eigenbasis must be a basis for $\mathbb{R}^2$ in this case. However, one basis vector cannot span $\mathbb{R}^2$, which means there is no eigenbasis for $A$.
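As an illustrative instance (a standard example, though not necessarily the matrix originally used here), take the shear matrix:

$$A=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad\det(A-\lambda I)=(1-\lambda)^2$$

The only eigenvalue is $\lambda_1=1$, and $(A-I)\mathbf{x}=\mathbf{0}$ forces $x_2=0$ while leaving $x_1=t$ free, so $E_1=\mathrm{span}\left\{\begin{pmatrix}1\\0\end{pmatrix}\right\}$ is one-dimensional and no eigenbasis of $\mathbb{R}^2$ exists.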
■
Theorem.
Union of the bases of eigenspaces corresponding to distinct eigenvalues is linearly independent
If $\lambda_1,\lambda_2,\cdots,\lambda_k$ are distinct eigenvalues of a matrix $A$ with corresponding eigenspaces $E_{\lambda_1},E_{\lambda_2},\cdots,E_{\lambda_k}$ spanned by bases $B_1,B_2,\cdots,B_k$ respectively, then the union $B_1\cup B_2\cup\cdots\cup B_k$ is linearly independent.
Proof. Let $\lambda_1,\lambda_2,\cdots,\lambda_k$ be distinct eigenvalues with corresponding eigenspaces $E_{\lambda_1},E_{\lambda_2},\cdots,E_{\lambda_k}$ spanned by bases $B_1,B_2,\cdots,B_k$ respectively. Suppose the union of the eigenbasis vectors is:

$$B_1\cup B_2\cup\cdots\cup B_k=\{\mathbf{v}_1,\mathbf{v}_2,\cdots,\mathbf{v}_m\}$$

where $m\ge k$. Note that $m$ may be greater than $k$ because the dimension of the eigenspace corresponding to an eigenvalue may be greater than $1$.

Now, consider the criterionlink for linear independence:

$$c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots+c_m\mathbf{v}_m=\mathbf{0}\tag{$*$}$$

where $c_1,c_2,\cdots,c_m$ are some scalar coefficients. Our goal is to show that all of these coefficients must equal zero for $(*)$ to hold - this will imply that the union is linearly independent by definitionlink. Let's group the eigenbasis vectors with the same eigenvalue:

$$\underbrace{(c_1\mathbf{v}_1+c_2\mathbf{v}_2)}_{\text{vectors in }B_1}+\underbrace{(c_3\mathbf{v}_3+\cdots)}_{\text{vectors in }B_2}+\cdots=\mathbf{0}$$

Here, we've grouped the eigenbasis vectors with the same eigenvalue - this is only an example to demonstrate what we mean by the grouping of eigenvectors. For instance, in this case, $\mathbf{v}_1$ and $\mathbf{v}_2$ are the eigenbasis vectors of $B_1$ corresponding to the first eigenvalue $\lambda_1$. Let's define the following vectors:

$$\mathbf{w}_1=c_1\mathbf{v}_1+c_2\mathbf{v}_2,\qquad\mathbf{w}_2=c_3\mathbf{v}_3+\cdots,\qquad\cdots$$

In general, $\mathbf{w}_i$ is the sum of the terms of $(*)$ whose eigenbasis vectors belong to $B_i$, so $\mathbf{w}_i\in E_{\lambda_i}$. We can rewrite $(*)$ as:

$$\mathbf{w}_1+\mathbf{w}_2+\cdots+\mathbf{w}_k=\mathbf{0}\tag{$**$}$$

For $(**)$ to hold, the following must be true:

$$\mathbf{w}_1=\mathbf{w}_2=\cdots=\mathbf{w}_k=\mathbf{0}$$

To understand why, suppose these vectors are not equal to the zero vector - then each $\mathbf{w}_i$, being a nonzero vector in $E_{\lambda_i}$, is an eigenvector of $A$ corresponding to $\lambda_i$. We can make $\mathbf{w}_1$ the subject in $(**)$ to get:

$$\mathbf{w}_1=-\mathbf{w}_2-\mathbf{w}_3-\cdots-\mathbf{w}_k$$

This means that we can express $\mathbf{w}_1$ as a linear combination of the other vectors $\mathbf{w}_2,\mathbf{w}_3,\cdots,\mathbf{w}_k$. In other words, $\mathbf{w}_1$ is linearly dependent on the other vectors. Now, recall that eigenvectors corresponding to distinct eigenvalues are linearly independent by theoremlink. This means that $\mathbf{w}_1,\mathbf{w}_2,\cdots,\mathbf{w}_k$ cannot all be eigenvectors, which implies they must be zero vectors (if only some of them are nonzero, the same argument applies to just the nonzero ones). This is why $(**)$ forces every $\mathbf{w}_i$ to be zero.

Going back to $\mathbf{w}_1=\mathbf{0}$, we have that:

$$c_1\mathbf{v}_1+c_2\mathbf{v}_2=\mathbf{0}$$

Because basis vectors are linearly independent by definitionlink, the following is true:

$$c_1=c_2=0$$

The same argument applied to $\mathbf{w}_2,\cdots,\mathbf{w}_k$ shows that every remaining coefficient is zero as well. Since all the coefficients must be zero for $(*)$ to hold, the union of the bases is linearly independent by definitionlink. This completes the proof.
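A small sketch of this grouping with an assumed diagonal matrix may help: for $A=\mathrm{diag}(2,2,3)$ we have $k=2$ distinct eigenvalues but $m=3$ basis vectors, since $E_2=\mathrm{span}\{\mathbf{e}_1,\mathbf{e}_2\}$ and $E_3=\mathrm{span}\{\mathbf{e}_3\}$. Writing:

$$c_1\mathbf{e}_1+c_2\mathbf{e}_2+c_3\mathbf{e}_3=\mathbf{0}$$

and grouping gives $\mathbf{w}_1=c_1\mathbf{e}_1+c_2\mathbf{e}_2$ and $\mathbf{w}_2=c_3\mathbf{e}_3$, and the argument above forces $c_1=c_2=c_3=0$ - here the union $\{\mathbf{e}_1,\mathbf{e}_2,\mathbf{e}_3\}$ is indeed linearly independent.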
■