# Comprehensive Guide on Algebraic and Geometric Multiplicity

Last updated: May 20, 2023

Tags: Linear Algebra
Definition.

# Algebraic multiplicity

The algebraic multiplicity of an eigenvalue $\lambda$ is the number of times $\lambda$ appears as a root of the characteristic polynomial. We sometimes denote the algebraic multiplicity of $\lambda$ as $\mathrm{AM}(\lambda)$.

Example.

## Finding the algebraic multiplicity (1)

Consider the following matrix:

$$\boldsymbol{A}= \begin{pmatrix} 5&2\\0&5 \end{pmatrix}$$

Find the algebraic multiplicity of every eigenvalue of $\boldsymbol{A}$.

Solution. The characteristic polynomial of $\boldsymbol{A}$ is:

\begin{align*} p_\boldsymbol{A}(\lambda)&= \det(\boldsymbol{A}-\lambda\boldsymbol{I}_2)\\ &= \begin{vmatrix} 5-\lambda&2\\0&5-\lambda \end{vmatrix}\\ &=(5-\lambda)(5-\lambda) \end{align*}

The root $\lambda=5$ appears twice, so the algebraic multiplicity of $\lambda=5$ is $2$.
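As a quick sanity check, this example can be reproduced with SymPy (an assumed dependency, not part of the original guide): `Matrix.eigenvals` returns each eigenvalue together with its algebraic multiplicity.

```python
from sympy import Matrix, symbols

lam = symbols("lam")
A = Matrix([[5, 2], [0, 5]])

# Characteristic polynomial; SymPy computes det(lam*I - A),
# which has the same roots as det(A - lam*I)
p = A.charpoly(lam).as_expr()

# eigenvals() maps each eigenvalue to its algebraic multiplicity
am = A.eigenvals()  # {5: 2}
```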

Example.

## Finding the algebraic multiplicity (2)

Consider the following matrix:

$$\boldsymbol{A}= \begin{pmatrix} 3&1\\0&4 \end{pmatrix}$$

Find the algebraic multiplicity of every eigenvalue of $\boldsymbol{A}$.

Solution. The characteristic polynomial of $\boldsymbol{A}$ is:

\begin{align*} p_\boldsymbol{A}(\lambda)&= \det(\boldsymbol{A}-\lambda\boldsymbol{I}_2)\\ &= \begin{vmatrix} 3-\lambda&1\\0&4-\lambda \end{vmatrix}\\ &=(3-\lambda)(4-\lambda) \end{align*}

The eigenvalues of $\boldsymbol{A}$ are $\lambda_1=3$ and $\lambda_2=4$, and they both occur once. Therefore, the algebraic multiplicity of $\lambda_1$ is $1$ and that of $\lambda_2$ is also $1$.

Definition.

# Geometric multiplicity

Let $\boldsymbol{A}$ be a square matrix. The geometric multiplicity of an eigenvalue $\lambda$ of $\boldsymbol{A}$ is the dimension of the eigenspace of $\lambda$. In other words, the geometric multiplicity is the nullity of $\boldsymbol{A}-\lambda\boldsymbol{I}$. We sometimes denote the geometric multiplicity of $\lambda$ as $\mathrm{GM}(\lambda)$.

Example.

## Finding the geometric multiplicity

Consider the following matrix:

$$\boldsymbol{A}= \begin{pmatrix} 3&2\\0&4 \end{pmatrix}$$

Find the geometric multiplicity of every eigenvalue of $\boldsymbol{A}$.

Solution. The characteristic polynomial of $\boldsymbol{A}$ is:

\begin{align*} p_\boldsymbol{A}(\lambda)&= \det(\boldsymbol{A}-\lambda\boldsymbol{I}_2)\\ &= \begin{vmatrix} 3-\lambda&2\\0&4-\lambda \end{vmatrix}\\ &=(3-\lambda)(4-\lambda) \end{align*}

Therefore, the eigenvalues of $\boldsymbol{A}$ are $\lambda_1=3$ and $\lambda_2=4$. Let's now find the geometric multiplicity of $\lambda_1$ and $\lambda_2$.

The null space of $\boldsymbol{A}-\lambda_1\boldsymbol{I}$ is the set of vectors $\boldsymbol{x}$ such that $(\boldsymbol{A}-\lambda_1\boldsymbol{I})\boldsymbol{x}=\boldsymbol{0}$. In matrix form, this is:

\begin{align*} \begin{pmatrix}3-\lambda_1&2\\0&4-\lambda_1\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix}\\ \begin{pmatrix}3-3&2\\0&4-3\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix}\\ \begin{pmatrix}0&2\\0&1\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix}\\ \begin{pmatrix}0&1\\0&0\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix} \end{align*}

We have that $x_1$ is a free variable and $x_2$ is a basic variable. Let $x_1=t$ where $t$ is some scalar. The solution of the homogeneous system is:

$$\begin{pmatrix} x_1\\x_2 \end{pmatrix}= \begin{pmatrix} t\\0 \end{pmatrix}= \begin{pmatrix} 1\\0 \end{pmatrix}t$$

Therefore, the basis for the null space of $\boldsymbol{A}-\lambda_1\boldsymbol{I}$ is:

$$\left\{\begin{pmatrix}1\\0\end{pmatrix}\right\}$$

The nullity is the dimension of the null space of $\boldsymbol{A}-\lambda_1\boldsymbol{I}$. In other words, the nullity is equal to the number of basis vectors for the null space of $\boldsymbol{A}-\lambda_1\boldsymbol{I}$. Since the basis of the null space consists of a single vector, the nullity of $\boldsymbol{A}-\lambda_1\boldsymbol{I}$ is $1$. Therefore, the geometric multiplicity of $\lambda_1$ is $1$.

Similarly, let's now find the geometric multiplicity of $\lambda_2=4$. The matrix form of $\boldsymbol{A}-\lambda_2\boldsymbol{I}$ is:

\begin{align*} \begin{pmatrix}3-\lambda_2&2\\0&4-\lambda_2\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix}\\ \begin{pmatrix}-1&2\\0&0\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix} \end{align*}

Here, $x_1$ is a basic variable while $x_2$ is a free variable. Therefore, we let $x_2=t$ where $t\in\mathbb{R}$. The first row gives us:

\begin{align*} -x_1+2t&=0\\ x_1&=2t \end{align*}

Therefore, the solution of the homogeneous system is:

$$\begin{pmatrix} x_1\\x_2 \end{pmatrix}= \begin{pmatrix} 2t\\t \end{pmatrix}= \begin{pmatrix} 2\\1 \end{pmatrix}t$$

Therefore, the basis for the null space of $\boldsymbol{A}-\lambda_2\boldsymbol{I}$ is:

$$\left\{\begin{pmatrix}2\\1\end{pmatrix}\right\}$$

Because the nullity of $\boldsymbol{A}-\lambda_2\boldsymbol{I}$ is $1$, the geometric multiplicity of $\lambda_2$ is $1$.
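The null-space computation above can be sketched in SymPy (assumed here, not part of the guide): `nullspace` returns a basis for the null space of $\boldsymbol{A}-\lambda\boldsymbol{I}$, and its length is the geometric multiplicity.

```python
from sympy import Matrix, eye

A = Matrix([[3, 2], [0, 4]])

# Geometric multiplicity = dimension of the null space of A - lam*I
gms = {lam: len((A - lam * eye(2)).nullspace())
       for lam in A.eigenvals()}
# gms -> {3: 1, 4: 1}
```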

Example.

## Finding the algebraic and geometric multiplicity

Consider the following matrix:

$$\boldsymbol{A}= \begin{pmatrix} 5&0\\0&5 \end{pmatrix}$$

Find the algebraic and geometric multiplicity of every eigenvalue of $\boldsymbol{A}$.

Solution. The characteristic polynomial of $\boldsymbol{A}$ is:

\begin{align*} p_\boldsymbol{A}(\lambda)&= \det(\boldsymbol{A}-\lambda\boldsymbol{I})\\ &=\begin{vmatrix} 5-\lambda&0\\0&5-\lambda \end{vmatrix}\\ &=(5-\lambda)(5-\lambda) \end{align*}

The only eigenvalue of $\boldsymbol{A}$ is $\lambda=5$. Since $\lambda=5$ appears twice as a root, the algebraic multiplicity of $\lambda$ is $2$. Let's now find the geometric multiplicity of $\lambda$. The matrix form of $(\boldsymbol{A}-\lambda\boldsymbol{I})\boldsymbol{x}=\boldsymbol{0}$ is:

\begin{align*} \begin{pmatrix}5-\lambda&0\\0&5-\lambda\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix}\\ \begin{pmatrix}5-5&0\\0&5-5\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix}\\ \begin{pmatrix}0&0\\0&0\end{pmatrix} \begin{pmatrix}x_1\\x_2\end{pmatrix}&= \begin{pmatrix}0\\0\end{pmatrix} \end{align*}

We have that $x_1$ and $x_2$ are both free variables. Let $x_1=r$ and $x_2=t$ where $r$ and $t$ are some scalars. The solution can be expressed as:

$$\boldsymbol{x}= \begin{pmatrix} r\\t \end{pmatrix}= \begin{pmatrix} 1\\0 \end{pmatrix}r+ \begin{pmatrix} 0\\1 \end{pmatrix}t$$

The eigenspace $\mathcal{E}$ of $\lambda=5$ is:

$$\mathcal{E}=\left\{ \begin{pmatrix}1\\0 \end{pmatrix}r+ \begin{pmatrix}0\\1 \end{pmatrix}t \;|\;r,t\in\mathbb{R}\right\}$$

The basis $\mathcal{B}$ of $\mathcal{E}$ is:

$$\mathcal{B}=\left\{ \begin{pmatrix}1\\0 \end{pmatrix},\;\; \begin{pmatrix}0\\1 \end{pmatrix}\right\}$$

Since $\mathcal{B}$ has $2$ basis vectors, the geometric multiplicity of $\lambda$ is $2$. Let's summarize the results below:

| Eigenvalue | Algebraic multiplicity | Geometric multiplicity |
| --- | --- | --- |
| $\lambda=5$ | $2$ | $2$ |

We will later prove that the algebraic multiplicity is always at least as large as the geometric multiplicity.
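The inequality between the two multiplicities can be checked numerically with SymPy (an assumed dependency): `eigenvects` returns each eigenvalue with its algebraic multiplicity and an eigenspace basis, so both quantities can be compared directly.

```python
from sympy import Matrix

# lam = 5 has AM = 2 and GM = 2 (the example above)
diagonal = Matrix([[5, 0], [0, 5]])
# lam = 5 has AM = 2 but GM = 1 (the first example on this page)
shear = Matrix([[5, 2], [0, 5]])

for M in (diagonal, shear):
    for lam, am, basis in M.eigenvects():
        gm = len(basis)   # dimension of the eigenspace
        assert am >= gm   # AM is always at least GM
```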

Theorem.

# Lower bound of the geometric multiplicity of an eigenvalue

Let $\boldsymbol{A}$ be an $n\times{n}$ matrix and $\lambda$ be an eigenvalue of $\boldsymbol{A}$. The geometric multiplicity of $\lambda$ is greater than or equal to $1$.

Proof. Since $\lambda$ is an eigenvalue of $\boldsymbol{A}$, it is a root of the characteristic polynomial:

$$\det(\boldsymbol{A}-\lambda\boldsymbol{I})=0$$

By an earlier theorem, this means that the matrix $\boldsymbol{A}-\lambda\boldsymbol{I}$ is not invertible. By an earlier theorem, the homogeneous system $(\boldsymbol{A}-\lambda\boldsymbol{I})\boldsymbol{x}=\boldsymbol{0}$ therefore has at least one non-trivial solution $\boldsymbol{x}$. This means that the null space of $\boldsymbol{A}-\lambda\boldsymbol{I}$ contains at least one eigenvector of $\lambda$.

By an earlier theorem, we know that scalar multiples of an eigenvector are also eigenvectors. Therefore, the eigenspace of $\lambda$ contains all scalar multiples of this eigenvector, so its dimension must be at least $1$.

This completes the proof.

Theorem.

# Upper bound of the algebraic multiplicity of an eigenvalue

Let $\boldsymbol{A}$ be an $n\times{n}$ matrix and $\lambda$ be an eigenvalue of $\boldsymbol{A}$. The algebraic multiplicity of $\lambda$ is at most $n$.

Proof. Let $\boldsymbol{A}$ be an $n\times{n}$ matrix. By an earlier theorem, we know that the degree of the characteristic polynomial of $\boldsymbol{A}$ is $n$. This means that an eigenvalue can occur as a root at most $n$ times. In other words, the algebraic multiplicity of an eigenvalue is at most $n$. This completes the proof.

Theorem.

# Sum of algebraic multiplicities is equal to the size of the associated matrix

Let $\boldsymbol{A}$ be an $n\times{n}$ matrix. If $\lambda_1$, $\lambda_2$, $\cdots$, $\lambda_k$ are the eigenvalues of $\boldsymbol{A}$, then the sum of the corresponding algebraic multiplicities is equal to $n$, that is:

$$\mathrm{AM}(\lambda_1)+ \mathrm{AM}(\lambda_2)+ \cdots+ \mathrm{AM}(\lambda_k)= n$$

Proof. By an earlier theorem, if $\boldsymbol{A}$ is an $n\times{n}$ matrix, then the characteristic polynomial of $\boldsymbol{A}$ has degree $n$. A polynomial of degree $n$ has exactly $n$ roots over the complex numbers, counted with multiplicity. Since the roots of the characteristic polynomial correspond to the eigenvalues of $\boldsymbol{A}$, we conclude that the sum of the algebraic multiplicities of the eigenvalues must be equal to $n$. This completes the proof.
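A small check of this theorem in SymPy (an assumed dependency): summing the values of `eigenvals` recovers $n$.

```python
from sympy import Matrix

# 3x3 matrix with eigenvalues 2 (AM = 2) and 7 (AM = 1)
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 7]])

am = A.eigenvals()          # {2: 2, 7: 1}
total = sum(am.values())    # sum of algebraic multiplicities
assert total == A.rows      # equals n = 3
```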

Theorem.

# Sum of geometric multiplicities of a square matrix with $n$ linearly independent eigenvectors is at least its size

Let $\boldsymbol{A}$ be an $n\times{n}$ matrix with $n$ linearly independent eigenvectors. If $\lambda_1$, $\lambda_2$, $\cdots$, $\lambda_k$ are the eigenvalues of $\boldsymbol{A}$, then the sum of the corresponding geometric multiplicities is equal to or greater than $n$, that is:

$$\mathrm{GM}(\lambda_1)+ \mathrm{GM}(\lambda_2)+ \cdots+ \mathrm{GM}(\lambda_k)\ge n$$

Note that this is only an intermediate result and we will later prove that the $\ge$ can be replaced with an equality.

Proof. Suppose $\boldsymbol{A}$ has the following $n$ linearly independent eigenvectors:

$$\begin{equation}\label{eq:lwwwN9VKQ6EzRnoSxiM} \{ \boldsymbol{v}_1, \boldsymbol{v}_2, \boldsymbol{v}_3, \cdots, \boldsymbol{v}_n \} \end{equation}$$

However, some of these eigenvectors may be associated with the same eigenvalue. For instance, suppose $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$ are associated with the same eigenvalue $\lambda=4$. Since $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$ are linearly independent, the number of basis vectors for the eigenspace of $\lambda=4$ must be at least $2$. The reason we say "at least" here is that the number of basis vectors for $\lambda=4$ may potentially be greater than $2$. For instance, consider the following case:

$$\boldsymbol{v}_1= \begin{pmatrix}1\\0\\0\end{pmatrix},\;\;\;\;\; \boldsymbol{v}_2= \begin{pmatrix}0\\1\\0\end{pmatrix}$$

The eigenspace for the eigenvalue may be spanned by the following basis vectors:

$$\mathcal{B}_1=\left\{\begin{pmatrix}1\\0\\0\end{pmatrix}, \begin{pmatrix}0\\1\\0\end{pmatrix}, \begin{pmatrix}0\\0\\1\end{pmatrix}\right\}$$

Here, because there are $3$ basis vectors, the corresponding geometric multiplicity is $3$. Note that we will later show that such a case is impossible.

By repeatedly applying this logic to the rest of the eigenvectors in \eqref{eq:lwwwN9VKQ6EzRnoSxiM}, we conclude that the sum of geometric multiplicities is greater than or equal to $n$. This completes the proof.
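Here is a hedged numeric illustration (SymPy assumed): for a matrix with $n$ linearly independent eigenvectors, summing the eigenspace dimensions recovers at least $n$ (in fact exactly $n$, as proven later).

```python
from sympy import Matrix

# Diagonal matrix: the 3 standard basis vectors are
# linearly independent eigenvectors
A = Matrix([[4, 0, 0],
            [0, 4, 0],
            [0, 0, 9]])

# Sum of geometric multiplicities over the distinct eigenvalues
gm_sum = sum(len(basis) for _, _, basis in A.eigenvects())
assert gm_sum >= A.rows   # here 2 + 1 = 3 = n
```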

Theorem.

# Algebraic multiplicity is greater than or equal to the geometric multiplicity

Let $\boldsymbol{A}$ be a square matrix and $\lambda$ be an eigenvalue of $\boldsymbol{A}$. The algebraic multiplicity of $\lambda$ is greater than or equal to the geometric multiplicity of $\lambda$, that is:

$$\mathrm{AM}(\lambda)\ge \mathrm{GM}(\lambda)$$

Proof. Suppose we have an $n\times{n}$ matrix $\boldsymbol{A}$ and an invertible $n\times{n}$ matrix $\boldsymbol{P}$. By an earlier theorem, we know that similar matrices have the same characteristic polynomial:

$$\begin{equation}\label{eq:IHhrbeAVsRoFdBeXaWV} \det(\boldsymbol{A}-\lambda\boldsymbol{I}) = \det(\boldsymbol{P}^{-1}\boldsymbol{AP}-\lambda\boldsymbol{I}) \end{equation}$$

Our goal is to derive an expression for the right-hand side of \eqref{eq:IHhrbeAVsRoFdBeXaWV} that includes a term representing the geometric multiplicity of an eigenvalue $\lambda_i$.

To simplify notations, let $k$ denote the geometric multiplicity of eigenvalue $\lambda_i$ and let $\{\boldsymbol{x}_1,\boldsymbol{x}_2, \cdots,\boldsymbol{x}_k\}$ be the basis for the corresponding eigenspace $\mathcal{E}_i$. We define an $n\times{n}$ invertible matrix $\boldsymbol{P}$ whose first $k$ columns are these basis vectors:

$$\boldsymbol{P}= \begin{pmatrix} \vert&\vert&\cdots&\vert&\vert&\cdots&\vert\\ \boldsymbol{x}_1&\boldsymbol{x}_2& \cdots&\boldsymbol{x}_k&\boldsymbol{x}_{k+1} &\cdots&\boldsymbol{x}_{n}\\ \vert&\vert&\cdots&\vert&\vert&\cdots&\vert\\ \end{pmatrix}$$

Note that such an invertible matrix $\boldsymbol{P}$ exists because the first $k$ columns are linearly independent by definition of basis vectors. We can ensure $\boldsymbol{P}$ is invertible by extending them with vectors $\boldsymbol{x}_{k+1}$, $\boldsymbol{x}_{k+2}$, $\cdots$, $\boldsymbol{x}_{n}$ such that all $n$ columns are linearly independent. By an earlier theorem, an $n\times{n}$ matrix with $n$ linearly independent columns is invertible. This guarantees the existence of an invertible matrix $\boldsymbol{P}$.

We now express $\boldsymbol{P}$ as a block matrix:

$$\boldsymbol{P}= \begin{pmatrix} \boldsymbol{B}&\boldsymbol{C} \end{pmatrix}$$

Where:

• $\boldsymbol{B}$ is an $n\times{k}$ sub-matrix whose columns are filled with the eigenbasis vectors.

• $\boldsymbol{C}$ is an $n\times(n-k)$ sub-matrix whose columns are $\boldsymbol{x}_{k+1}$, $\boldsymbol{x}_{k+2}$, $\cdots$, $\boldsymbol{x}_{n}$.

Similarly, let's represent the inverse matrix $\boldsymbol{P}^{-1}$ using block format:

$$\boldsymbol{P}^{-1}= \begin{pmatrix} \boldsymbol{D}\\\boldsymbol{E} \end{pmatrix}$$

Where:

• $\boldsymbol{D}$ is a $k\times{n}$ sub-matrix.

• $\boldsymbol{E}$ is an $(n-k)\times{n}$ sub-matrix.

By an earlier theorem on block matrix multiplication, the matrix product $\boldsymbol{P}^{-1}\boldsymbol{P}$ can be expressed as:

$$\boldsymbol{P}^{-1}\boldsymbol{P} = \begin{pmatrix} \boldsymbol{DB}&\boldsymbol{DC}\\ \boldsymbol{EB}&\boldsymbol{EC} \end{pmatrix}$$

Where:

• $\boldsymbol{DB}$ is a $k\times{k}$ sub-matrix.

• $\boldsymbol{DC}$ is a $k\times(n-k)$ sub-matrix.

• $\boldsymbol{EB}$ is an $(n-k)\times{k}$ sub-matrix.

• $\boldsymbol{EC}$ is an $(n-k)\times(n-k)$ sub-matrix.

By definition, a matrix multiplied by its inverse is equal to the identity matrix, that is, $\boldsymbol{P}^{-1}\boldsymbol{P}=\boldsymbol{I}_n$. Therefore:

• $\boldsymbol{DB}=\boldsymbol{I}_k$.

• $\boldsymbol{EC}=\boldsymbol{I}_{n-k}$.

• $\boldsymbol{DC}$ is a $k\times(n-k)$ sub-matrix whose entries are all zeroes.

• $\boldsymbol{EB}$ is an $(n-k)\times{k}$ sub-matrix whose entries are all zeroes.

Now, consider $\boldsymbol{P}^{-1}\boldsymbol{AP}$. By earlier theorems on block matrix multiplication, $\boldsymbol{P}^{-1}\boldsymbol{AP}$ can be expressed as:

\begin{equation}\label{eq:jBeRUvHtoFRDerjz6Sl} \begin{aligned}[b] \boldsymbol{P}^{-1} \boldsymbol{A}\boldsymbol{P}&= \begin{pmatrix}\boldsymbol{D}\\\boldsymbol{E}\end{pmatrix}\boldsymbol{A} \begin{pmatrix}\boldsymbol{B}&\boldsymbol{C}\end{pmatrix}\\ &= \begin{pmatrix}\boldsymbol{DA}\\\boldsymbol{EA}\end{pmatrix} \begin{pmatrix}\boldsymbol{B}&\boldsymbol{C}\end{pmatrix}\\ &= \begin{pmatrix}\boldsymbol{DAB}&\boldsymbol{DAC}\\ \boldsymbol{EAB}&\boldsymbol{EAC}\end{pmatrix} \end{aligned} \end{equation}

Since every column $\boldsymbol{x}_j$ of $\boldsymbol{B}$ (for $j=1,2,\cdots,k$) is an eigenvector of $\boldsymbol{A}$ corresponding to $\lambda_i$, the following holds by definition:

$$\boldsymbol{A}\boldsymbol{x}_j =\lambda_i\boldsymbol{x}_j$$

Applying this equation to each column of $\boldsymbol{B}$ gives:

$$\boldsymbol{AB}=\lambda_i\boldsymbol{B}$$

Multiplying on the left by $\boldsymbol{D}$ (shape $k\times{n}$) and by $\boldsymbol{E}$ (shape $(n-k)\times{n}$) yields $\boldsymbol{DAB}=\lambda_i\boldsymbol{DB}$ and $\boldsymbol{EAB}=\lambda_i\boldsymbol{EB}$. Now we go back to \eqref{eq:jBeRUvHtoFRDerjz6Sl} and perform the following substitutions:

\begin{align*} \boldsymbol{P}^{-1} \boldsymbol{A}\boldsymbol{P} &= \begin{pmatrix}\boldsymbol{DAB}&\boldsymbol{DAC}\\ \boldsymbol{EAB}&\boldsymbol{EAC}\end{pmatrix}\\ &= \begin{pmatrix}\lambda_i\boldsymbol{DB}&\boldsymbol{DAC}\\ \lambda_i\boldsymbol{EB}&\boldsymbol{EAC}\end{pmatrix}\\ &= \begin{pmatrix}\lambda_i\boldsymbol{I}_k&\boldsymbol{DAC}\\ \boldsymbol{O}&\boldsymbol{EAC}\end{pmatrix} \end{align*}

Subtracting $\lambda\boldsymbol{I}$ from both sides, where $\lambda$ is the variable of the characteristic polynomial, yields:

\begin{align*} \boldsymbol{P}^{-1} \boldsymbol{A}\boldsymbol{P}-\lambda\boldsymbol{I} &= \begin{pmatrix}\lambda_i\boldsymbol{I}_k -\lambda\boldsymbol{I}_k &\boldsymbol{DAC}\\ \boldsymbol{O}&\boldsymbol{EAC}-\lambda\boldsymbol{I}_{n-k} \end{pmatrix}\\ &= \begin{pmatrix}(\lambda_i -\lambda)\boldsymbol{I}_k &\boldsymbol{DAC}\\ \boldsymbol{O}&\boldsymbol{EAC}-\lambda\boldsymbol{I}_{n-k} \end{pmatrix} \end{align*}

We now take the determinant of both sides. By an earlier theorem on the determinant of block triangular matrices, we have that:

$$\det(\boldsymbol{P}^{-1}\boldsymbol{AP}-\lambda\boldsymbol{I})= \det\big((\lambda_i-\lambda)\boldsymbol{I}_k\big) \cdot\det\big(\boldsymbol{EAC}-\lambda\boldsymbol{I}_{n-k}\big)$$

$$\det(\boldsymbol{P}^{-1}\boldsymbol{AP}- \lambda\boldsymbol{I})= (\lambda_i-\lambda)^k \cdot\det\big(\boldsymbol{EAC}-\lambda\boldsymbol{I}_{n-k}\big)$$

We substitute this into \eqref{eq:IHhrbeAVsRoFdBeXaWV} to finally get:

$$\det(\boldsymbol{A}-\lambda\boldsymbol{I}) = (\lambda_i-\lambda)^k \cdot\det\big(\boldsymbol{EAC}-\lambda\boldsymbol{I}_{n-k}\big)$$

Because the factor $(\lambda_i-\lambda)$ appears at least $k$ times in the characteristic polynomial, the algebraic multiplicity of $\lambda_i$ is at least $k$, which is the geometric multiplicity of $\lambda_i$. This completes the proof.
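To make the construction concrete, here is the proof specialized (as a sketch) to the earlier matrix $\boldsymbol{A}=\begin{pmatrix}5&2\\0&5\end{pmatrix}$ with $\lambda_i=5$ and $k=\mathrm{GM}(5)=1$: taking the eigenvector $(1,0)^T$ as the first column of $\boldsymbol{P}$ and extending with $(0,1)^T$ gives $\boldsymbol{P}=\boldsymbol{I}_2$, so:

$$\boldsymbol{P}^{-1}\boldsymbol{AP}= \begin{pmatrix}5&2\\0&5\end{pmatrix}= \begin{pmatrix}\lambda_i\boldsymbol{I}_1&\boldsymbol{DAC}\\ \boldsymbol{O}&\boldsymbol{EAC}\end{pmatrix},\qquad \det(\boldsymbol{A}-\lambda\boldsymbol{I})= (5-\lambda)^1\cdot\det\big(\boldsymbol{EAC}-\lambda\boldsymbol{I}_1\big)= (5-\lambda)(5-\lambda)$$

The factor $(5-\lambda)^k$ with $k=1$ guarantees $\mathrm{AM}(5)\ge 1=\mathrm{GM}(5)$; here the remaining factor happens to contribute another root $5$, so $\mathrm{AM}(5)=2$.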

Theorem.

# Relationship between diagonalization, algebraic and geometric multiplicity

Let $\boldsymbol{A}$ be an $n\times{n}$ matrix. The following statements are equivalent:

1. $\boldsymbol{A}$ is a diagonalizable matrix.

2. The sum of the geometric multiplicities of the eigenvalues of $\boldsymbol{A}$ is equal to $n$.

3. The geometric multiplicity and the algebraic multiplicity of every eigenvalue are equal.

Proof. We will prove $(1)\implies(2)\implies(3)\implies(1)$. We start by assuming $(1)$, that the $n\times{n}$ matrix $\boldsymbol{A}$ is diagonalizable. By an earlier theorem, since $\boldsymbol{A}$ is diagonalizable, $\boldsymbol{A}$ has $n$ linearly independent eigenvectors $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_n$.

Suppose $\boldsymbol{A}$ has $t$ distinct eigenvalues and let the corresponding geometric multiplicities be $k_1$, $k_2$, $\cdots$, $k_t$ and the algebraic multiplicities be $m_1$, $m_2$, $\cdots$, $m_t$.

By an earlier theorem, the sum of the geometric multiplicities must be greater than or equal to $n$, that is:

$$\begin{equation}\label{eq:TQhsAouxJRmwjztMdpm} k_1+k_2+\cdots+k_t\ge{n} \end{equation}$$

By an earlier theorem, the sum of the algebraic multiplicities is equal to $n$, that is:

$$\begin{equation}\label{eq:x90dgy3rmNu24WhyvLP} n=m_1+m_2+\cdots+m_t \end{equation}$$

Substituting \eqref{eq:x90dgy3rmNu24WhyvLP} into \eqref{eq:TQhsAouxJRmwjztMdpm} gives:

$$\begin{equation}\label{eq:oo3zHU2Uji1nxu4Z8Dl} k_1+k_2+\cdots+k_t\ge {m_1+m_2+\cdots+m_t} \end{equation}$$

Now, from an earlier theorem, we have that the algebraic multiplicity of an eigenvalue is greater than or equal to the geometric multiplicity of the same eigenvalue, that is:

$$\begin{equation}\label{eq:UShoHmbQS4jJcIj4U30} \begin{gathered} m_1\ge{k_1}\\ m_2\ge{k_2}\\ \vdots\\ m_t\ge{k_t}\\ \end{gathered} \end{equation}$$

Summing all of these inequalities gives:

$$\begin{equation}\label{eq:SoPWDF64YY5gRAlrCBa} m_1+m_2+\cdots+m_t\ge{k_1+k_2+\cdots+k_t} \end{equation}$$

Combining \eqref{eq:oo3zHU2Uji1nxu4Z8Dl} and \eqref{eq:SoPWDF64YY5gRAlrCBa} gives:

$$k_1+k_2+\cdots+k_t\;\;{\color{blue}\ge}\;\; m_1+m_2+\cdots+m_t\;\;{\color{blue}\ge}\;\;{k_1+k_2+\cdots+k_t}$$

This means that:

$$k_1+k_2+\cdots+k_t = m_1+m_2+\cdots+m_t$$

Combining with \eqref{eq:x90dgy3rmNu24WhyvLP} gives:

$$n=k_1+k_2+\cdots+k_t = m_1+m_2+\cdots+m_t$$

This proves $(1)\implies(2)$.

* * *

We now prove $(2)\implies(3)$. Assume $(2)$, that the sum of the geometric multiplicities is $n$. By an earlier theorem, the sum of the algebraic multiplicities is also $n$, which means:

$$\begin{equation}\label{eq:vCICCrzPdLZnYtErI9X} n=k_1+k_2+\cdots+k_t = m_1+m_2+\cdots+m_t \end{equation}$$

Again, by the theorem relating the two multiplicities, we have that:

$$\begin{equation}\label{eq:DRURNqiWpvCY2V0roHb} \begin{gathered} m_1\ge{k_1}\\ m_2\ge{k_2}\\ \vdots\\ m_t\ge{k_t}\\ \end{gathered} \end{equation}$$

Combining \eqref{eq:vCICCrzPdLZnYtErI9X} and \eqref{eq:DRURNqiWpvCY2V0roHb} implies:

$$\begin{gather*} m_1=k_1\\ m_2=k_2\\ \vdots\\ m_t=k_t \end{gather*}$$

This means that for every eigenvalue $\lambda_i$ for $i=1,2,\cdots,t$, the corresponding geometric multiplicity and algebraic multiplicity are equal. This proves $(2)\implies(3)$.

* * *

We now prove $(3)\implies(1)$. Suppose $\lambda_1$, $\lambda_2$, $\cdots$, $\lambda_t$ are the distinct eigenvalues of an $n\times{n}$ matrix $\boldsymbol{A}$ where $t\le{n}$. Denote the corresponding eigenspaces by $\mathcal{E}_1$, $\mathcal{E}_2$, $\cdots$, $\mathcal{E}_t$ and the corresponding bases by $\mathcal{B}_1$, $\mathcal{B}_2$, $\cdots$, $\mathcal{B}_t$ respectively.

We assume $(3)$, that the geometric multiplicity and algebraic multiplicity of every eigenvalue are equal. By an earlier theorem, because the sum of the algebraic multiplicities is equal to $n$, the sum of the geometric multiplicities is also equal to $n$.

By an earlier theorem, because the bases $\mathcal{B}_1$, $\mathcal{B}_2$, $\cdots$, $\mathcal{B}_t$ correspond to distinct eigenvalues $\lambda_1$, $\lambda_2$, $\cdots$, $\lambda_t$, the union $\mathcal{B}_1\cup \mathcal{B}_2\cup \cdots\cup \mathcal{B}_t$ is linearly independent. This union can be expressed as:

$$\mathcal{B}_1\cup{\mathcal{B}_2}\cup\cdots\cup \mathcal{B}_t= \{\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_n\}$$

Where $\boldsymbol{w}_i$ for $i=1,2,\cdots,n$ is a basis vector. The union contains $n$ vectors because the sum of the geometric multiplicities is equal to $n$, which means that the total number of basis vectors combined into a single set must be $n$.

We've now established that $\boldsymbol{A}$ has $n$ linearly independent eigenvectors. By an earlier theorem, an $n\times{n}$ matrix with $n$ linearly independent eigenvectors is diagonalizable. Therefore, $\boldsymbol{A}$ is diagonalizable.

This completes the proof.
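SymPy (an assumed dependency) packages this equivalence as `is_diagonalizable`, which can be cross-checked against the multiplicities computed earlier:

```python
from sympy import Matrix

# AM(5) = 2 but GM(5) = 1: multiplicities differ, not diagonalizable
shear = Matrix([[5, 2], [0, 5]])
# AM(5) = GM(5) = 2: multiplicities agree, diagonalizable
scaled_identity = Matrix([[5, 0], [0, 5]])

assert not shear.is_diagonalizable()
assert scaled_identity.is_diagonalizable()
```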

Example.

## Showing that a matrix is diagonalizable using algebraic and geometric multiplicity

Show that the following matrix is diagonalizable:

$$\boldsymbol{A}= \begin{pmatrix} 5&0\\0&5 \end{pmatrix}$$

Solution. We have shown in the example above that:

| Eigenvalue | Algebraic multiplicity | Geometric multiplicity |
| --- | --- | --- |
| $\lambda=5$ | $2$ | $2$ |

By the theorem above, we can refer to either of the following facts to prove that $\boldsymbol{A}$ is diagonalizable:

• the sum of the geometric multiplicities of the eigenvalues is equal to $n=2$.

• the algebraic multiplicity and the geometric multiplicity of every eigenvalue are equal.
