# Introduction to Determinants

Jan 20, 2023

The ultimate goal of this chapter is to derive the relationship between matrix invertibility and the determinant. Unfortunately, we cannot dive straight into its proof just yet - we will first need to establish a series of basic properties of determinants.

# Determinant of a 2x2 matrix

Consider the $2\times2$ matrix $\boldsymbol{A}$ below:

$$\boldsymbol{A}=\begin{pmatrix}a&b\\c&d\end{pmatrix}$$

The determinant of $\boldsymbol{A}$ is defined as:

$$\det(\boldsymbol{A})=ad-bc$$

The determinant of $\boldsymbol{A}$ is sometimes also written as:

$$\vert\boldsymbol{A}\vert=\begin{vmatrix}a&b\\c&d\end{vmatrix}$$

Note that there exists a more general definition of determinants that applies to a square matrix of any size. We will cover this later in this guide.
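As a minimal sketch, the $2\times2$ formula translates directly into code; the function name `det_2x2` and the sample matrix are our own illustration, not from the original text:

```python
def det_2x2(A):
    """Determinant of a 2x2 matrix A given as a list of two rows:
    det(A) = ad - bc."""
    a, b = A[0]
    c, d = A[1]
    return a * d - b * c

# Example (ours): det = 3*2 - 1*4 = 2
print(det_2x2([[3, 1], [4, 2]]))  # 2
```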

## Computing the determinant of a 2x2 matrix (1)

Compute the determinant of the following matrix:

Solution. The determinant of $\boldsymbol{A}$ is:

Later in the chapter, we will go over what it means for $\det(\boldsymbol{A})=2$. Specifically, we will cover:

- the geometric interpretation behind determinants.
- the relationship between determinant and invertibility.

## Computing the determinant of a 2x2 matrix (2)

Compute the following determinant:

Solution. The determinant is:

# Minor and cofactor of an entry

Suppose $\boldsymbol{A}$ is a square matrix. The minor $M_{ij}$ of an entry $a_{ij}$ is defined as the determinant of the matrix that remains after removing the row and column that contains $a_{ij}$. The cofactor $C_{ij}$ of an entry $a_{ij}$ is defined as $(-1)^{i+j}M_{ij}$.
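To make the definition concrete, here is a sketch for the $3\times3$ case; the helper names `submatrix`, `minor`, `cofactor` and the sample matrix are ours, not from the text:

```python
def submatrix(A, i, j):
    """Square matrix A with row i and column j removed (1-indexed)."""
    return [[A[r][c] for c in range(len(A)) if c != j - 1]
            for r in range(len(A)) if r != i - 1]

def det_2x2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def minor(A, i, j):
    """Minor M_ij of a 3x3 matrix: determinant of the remaining 2x2."""
    return det_2x2(submatrix(A, i, j))

def cofactor(A, i, j):
    """Cofactor C_ij = (-1)^(i+j) * M_ij."""
    return (-1) ** (i + j) * minor(A, i, j)

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(minor(A, 1, 1), cofactor(A, 1, 1))   # 2 2
print(minor(A, 1, 2), cofactor(A, 1, 2))   # -2 2
```

Note how the cofactor at position $(1,2)$ flips the sign of the minor, while the cofactor at $(1,1)$ does not.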

## Computing the minor and cofactor

Consider the following matrix:

Compute the minor and cofactor of the green and red entries.

Solution. To compute the minor of the green entry, we first must ignore the row and column (colored in blue below) that holds this entry:

The determinant of the remaining sub-matrix is:

Therefore, the minor of the green entry is $M_{11}=6$. Since the green entry is located in the $1$st row and $1$st column, its cofactor is:

$$C_{11}=(-1)^{1+1}M_{11}=(1)(6)=6$$

Next, let's find the minor and cofactor of the red entry. We ignore the following values in blue:

The minor of the red entry is the determinant of the sub-matrix:

The cofactor of the red entry is:

Notice how the cofactor of an entry is identical to the minor of that entry except that the sign may differ depending on the position of the entry in the matrix. The following theorem is useful for keeping track of the sign.

# Checkerboard pattern of signs

The relationship between the signs of the cofactor and minor of an entry is described by the checkerboard pattern of signs shown below:

$$\begin{pmatrix}+&-&+&\cdots\\-&+&-&\cdots\\+&-&+&\cdots\\\vdots&\vdots&\vdots&\ddots\end{pmatrix}$$

For instance, for the entry $a_{22}$, no sign flip occurs and hence its cofactor is equal to its minor. However, for the entry $a_{23}$, the cofactor and minor have opposite signs.

Proof. By definition, the cofactor of the entry $a_{ij}$ is equal to the minor of $a_{ij}$ multiplied by $(-1)^{i+j}$. This means that for the top-left entry $a_{11}$, the associated sign is positive because $(-1)^{1+1}=1$. As we move along a row or column, $i+j$ alternates between even and odd, so the sign alternates between positive and negative, which gives us the checkerboard pattern of signs. This completes the proof.
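The sign pattern can be generated directly from $(-1)^{i+j}$; a small illustrative sketch (variable names ours):

```python
# Checkerboard of signs for a 4x4 matrix: "+" wherever (-1)^(i+j) = 1,
# using 1-indexed row i and column j as in the text.
n = 4
signs = [["+" if (i + j) % 2 == 0 else "-" for j in range(1, n + 1)]
         for i in range(1, n + 1)]
for row in signs:
    print(" ".join(row))
```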

# Cofactor expansion along a row or column

Suppose we have an $n\times{n}$ matrix $\boldsymbol{A}$. The cofactor expansion along the $i$-th row is defined as:

$$a_{i1}C_{i1}+a_{i2}C_{i2}+\cdots+a_{in}C_{in}$$

Where $C_{i1}$ is the cofactor of the entry in the $i$-th row $1$st column, $C_{i2}$ is the cofactor of the entry in the $i$-th row $2$nd column, and so on.

The cofactor expansion along the $j$-th column is defined as:

$$a_{1j}C_{1j}+a_{2j}C_{2j}+\cdots+a_{nj}C_{nj}$$

## Performing the cofactor expansion along a row or column

Consider the following matrix:

Perform the following:

- cofactor expansion along the $1$st row.
- cofactor expansion along the $1$st column.
- cofactor expansion along the $2$nd row.

Solution. The cofactor expansion along the $1$st row is:

Remember our checkerboard pattern of signs - this is why we see a negative for the second term!

The cofactor expansion along the $1$st column is:

The cofactor expansion along the $2$nd row is:

Notice how all of these cofactor expansions result in the same value! As we shall prove at the very end of this chapter, the cofactor expansion along any row or column yields the same value 🤯!
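This claim is easy to check numerically. The sketch below (function names, sample matrix, and the 0-indexed convention are our own) expands a $3\times3$ matrix along every row and every column:

```python
def det(A):
    """Determinant via cofactor expansion along the first row (recursive)."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j]
               * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(n))

def expand_row(A, i):
    """Cofactor expansion along row i (0-indexed)."""
    n = len(A)
    return sum((-1) ** (i + j) * A[i][j]
               * det([row[:j] + row[j + 1:]
                      for k, row in enumerate(A) if k != i])
               for j in range(n))

def expand_col(A, j):
    """Cofactor expansion along column j (0-indexed)."""
    n = len(A)
    return sum((-1) ** (i + j) * A[i][j]
               * det([row[:j] + row[j + 1:]
                      for k, row in enumerate(A) if k != i])
               for i in range(n))

A = [[2, 0, 1], [1, 3, 2], [1, 1, 0]]
print([expand_row(A, i) for i in range(3)])  # [-6, -6, -6]
print([expand_col(A, j) for j in range(3)])  # [-6, -6, -6]
```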

## General definition of determinants

If $\boldsymbol{A}$ is an $n\times{n}$ matrix, then the determinant of $\boldsymbol{A}$ is defined as the cofactor expansion along the first row:

$$\det(\boldsymbol{A})=a_{11}C_{11}+a_{12}C_{12}+\cdots+a_{1n}C_{1n}$$

Where:

- $a_{1n}$ is the entry in the $1$st row $n$-th column of $\boldsymbol{A}$.
- $C_{1n}$ is the cofactor of the entry $a_{1n}$.

As we have demonstrated in the previous example, the determinant can actually be computed using cofactor expansion along any row or column. Again, we will prove this later in the chapter!
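A direct translation of this recursive definition into code might look as follows; the function name `det` and the test matrices are our own:

```python
def det(A):
    """Determinant of an n x n matrix via cofactor expansion
    along the first row, following the general definition."""
    n = len(A)
    if n == 1:                  # base case: det of a 1x1 matrix is its entry
        return A[0][0]
    total = 0
    for j in range(n):
        # Sub-matrix with row 1 and column j+1 (1-indexed) removed.
        sub = [row[:j] + row[j + 1:] for row in A[1:]]
        # Cofactor sign (-1)^(1 + (j+1)) reduces to (-1)^j in 0-indexed form.
        total += (-1) ** j * A[0][j] * det(sub)
    return total

print(det([[3, 1], [4, 2]]))                    # 2
print(det([[2, 0, 1], [1, 3, 2], [1, 1, 0]]))   # -6
```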

## Computing the determinant of a 3x3 matrix

Compute the determinant of the following matrix:

Solution. The determinant of $\boldsymbol{A}$ is:

## Deriving the definition of 2x2 determinant

Suppose we have the following $2\times2$ matrix:

$$\boldsymbol{A}=\begin{pmatrix}a&b\\c&d\end{pmatrix}$$

We have previously stated that the definition of the determinant of this $2\times2$ matrix is:

$$\det(\boldsymbol{A})=ad-bc$$

Let's derive this definition ourselves using the general definition of determinant. The determinant is defined as the cofactor expansion along the first row:

$$\det(\boldsymbol{A})=aC_{11}+bC_{12}=a(-1)^{1+1}M_{11}+b(-1)^{1+2}M_{12}=ad-bc$$

Here, $M_{11}=d$ and $M_{12}=c$ because removing the row and column containing each entry leaves a $1\times1$ matrix.

# Determinant is equal to the cofactor expansion along the first column

We originally defined the determinant to be equal to the cofactor expansion along the first row. The determinant is also equal to the cofactor expansion along the first column.

Proof. We will prove this by induction. We first must show that the proposition holds for the $2\times2$ case:

$$\boldsymbol{A}=\begin{pmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{pmatrix}$$

The cofactor expansion along the first row is:

$$a_{11}C_{11}+a_{12}C_{12}=a_{11}a_{22}-a_{12}a_{21}$$

Remember, this is equal to $\det(\boldsymbol{A})$ as per the definition of determinant. The cofactor expansion along the first column is:

$$a_{11}C_{11}+a_{21}C_{21}=a_{11}a_{22}-a_{21}a_{12}$$

Therefore, the cofactor expansion along the first row and that along the first column are equal! This means that we can use the cofactor expansion along the first column to compute the determinant as well.
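Before working through the inductive step, the proposition can be sanity-checked numerically on random integer matrices; a sketch under the assumption of exact integer arithmetic (function names ours):

```python
import random

def det(A):
    """Determinant via cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j]
               * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(n))

def expand_first_col(A):
    """Cofactor expansion along the first column."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** i * A[i][0]
               * det([row[1:] for k, row in enumerate(A) if k != i])
               for i in range(n))

random.seed(0)
for _ in range(100):
    A = [[random.randint(-9, 9) for _ in range(4)] for _ in range(4)]
    assert det(A) == expand_first_col(A)
print("first-row and first-column expansions agree on 100 random matrices")
```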

Now, in a typical proof by induction, we would assume that the proposition holds for the $(n-1)\times(n-1)$ case and show that it also holds for the $n\times{n}$ case. However, one of the problems with working with the general case is that the notation becomes convoluted and obscures the essence of the proof - this is precisely why almost all introductory linear algebra books avoid this proof. The approach we will take here is to consider the simple $3\times3$ case, but we will attempt to make general claims during our proof.

Consider the following $3\times3$ matrix:

$$\boldsymbol{A}=\begin{pmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33}\end{pmatrix}$$

The cofactor expansion along the first row and the cofactor expansion along the first column are:

$$\begin{aligned}
C_{\mathrm{row}=1}&=a_{11}\begin{vmatrix}a_{22}&a_{23}\\a_{32}&a_{33}\end{vmatrix}-a_{12}\begin{vmatrix}a_{21}&a_{23}\\a_{31}&a_{33}\end{vmatrix}+a_{13}\begin{vmatrix}a_{21}&a_{22}\\a_{31}&a_{32}\end{vmatrix}\\
C_{\mathrm{col}=1}&=a_{11}\begin{vmatrix}a_{22}&a_{23}\\a_{32}&a_{33}\end{vmatrix}-a_{21}\begin{vmatrix}a_{12}&a_{13}\\a_{32}&a_{33}\end{vmatrix}+a_{31}\begin{vmatrix}a_{12}&a_{13}\\a_{22}&a_{23}\end{vmatrix}
\end{aligned}\label{eq:nPNlDtMwbd4Y30AI0Uv}$$

Our goal is to show the following:

$$C_{\mathrm{row}=1}=C_{\mathrm{col}=1}$$

Notice how the first terms in \eqref{eq:nPNlDtMwbd4Y30AI0Uv} are equal. Therefore, let's ignore the first term - our goal now is to show the equivalence between the following:

$$-a_{12}\begin{vmatrix}a_{21}&a_{23}\\a_{31}&a_{33}\end{vmatrix}+a_{13}\begin{vmatrix}a_{21}&a_{22}\\a_{31}&a_{32}\end{vmatrix}
\quad\text{and}\quad
-a_{21}\begin{vmatrix}a_{12}&a_{13}\\a_{32}&a_{33}\end{vmatrix}+a_{31}\begin{vmatrix}a_{12}&a_{13}\\a_{22}&a_{23}\end{vmatrix}$$

We express the determinants using the following notation:

$$C'_{\mathrm{row}=1}=-a_{12}\det(\boldsymbol{A}_{12})+a_{13}\det(\boldsymbol{A}_{13})\label{eq:TwW0mYFQngeOY8JOuu3}$$

$$C'_{\mathrm{col}=1}=-a_{21}\det(\boldsymbol{A}_{21})+a_{31}\det(\boldsymbol{A}_{31})\label{eq:WvYWE0w8RWFqo9Y34fc}$$

Where $\boldsymbol{A}_{12}$ represents a sub-matrix with $1$st row and $2$nd column removed from the original matrix $\boldsymbol{A}$. The inductive assumption is that the cofactor expansion along the first row is equal to the cofactor expansion along the first column for the $2\times2$ case. We will use this assumption in the very next step.

Let's start by focusing on $C'_{\mathrm{row}=1}$. We compute the first determinant in \eqref{eq:TwW0mYFQngeOY8JOuu3} by cofactor expansion along the first column:

$$-a_{12}\det(\boldsymbol{A}_{12})=-a_{12}\bigl(a_{21}\det(\boldsymbol{A}_{{\color{green}12},{\color{red}21}})-a_{31}\det(\boldsymbol{A}_{{\color{green}12},{\color{red}31}})\bigr)=-a_{12}a_{21}\det(\boldsymbol{A}_{{\color{green}12},{\color{red}21}})+a_{12}a_{31}\det(\boldsymbol{A}_{{\color{green}12},{\color{red}31}})\label{eq:TgWTPrRpUbT3wFUwe0C}$$

Here, $\boldsymbol{A}_{{\color{green}12},{\color{red}21}}$ represents the sub-matrix in which the following two pairs of rows and columns are removed from $\boldsymbol{A}$:

- the $1$st row and $2$nd column.
- the $2$nd row and $1$st column.

Visually, the sub-matrix (the non-colored term) looks like the following:

Notice how if we were to perform cofactor expansion to find the other terms in \eqref{eq:TwW0mYFQngeOY8JOuu3} as we did in \eqref{eq:TgWTPrRpUbT3wFUwe0C}, the terms with the specific combination $a_{12}a_{21}$ and $a_{12}a_{31}$ will only appear once in \eqref{eq:TwW0mYFQngeOY8JOuu3}. This also means that their corresponding determinants $\det(\boldsymbol{A}_{{\color{green}12},{\color{red}21}})$ and $\det(\boldsymbol{A}_{{\color{green}12},{\color{red}31}})$ will also appear once in \eqref{eq:TwW0mYFQngeOY8JOuu3}. In general, the following form will only appear in \eqref{eq:TwW0mYFQngeOY8JOuu3} once:

$$(-1)^{i+j+1}a_{1j}a_{i1}\det(\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}})\label{eq:jTj3w8z7xMFMhkx2ujs}$$

Let's now move on to $C'_{\mathrm{col}=1}$ in \eqref{eq:WvYWE0w8RWFqo9Y34fc}. Once again, let's focus on the first determinant - but this time, we perform cofactor expansion along the first row:

$$-a_{21}\det(\boldsymbol{A}_{21})=-a_{21}\bigl(a_{12}\det(\boldsymbol{A}_{{\color{green}21},{\color{red}12}})-a_{13}\det(\boldsymbol{A}_{{\color{green}21},{\color{red}13}})\bigr)=-a_{21}a_{12}\det(\boldsymbol{A}_{{\color{green}21},{\color{red}12}})+a_{21}a_{13}\det(\boldsymbol{A}_{{\color{green}21},{\color{red}13}})\label{eq:gnYpM9eGtvYjQB3RA65}$$

The same idea holds here - if we were to perform cofactor expansion on the other terms in \eqref{eq:WvYWE0w8RWFqo9Y34fc} as we did in \eqref{eq:gnYpM9eGtvYjQB3RA65}, the terms with the combination $a_{21}a_{12}$ and $a_{21}a_{13}$ will only appear once in \eqref{eq:WvYWE0w8RWFqo9Y34fc}. In general, the following form will appear in \eqref{eq:WvYWE0w8RWFqo9Y34fc} once:

$$(-1)^{i+j+1}a_{i1}a_{1j}\det(\boldsymbol{A}_{{\color{green}i1},{\color{red}1j}})\label{eq:QzQT6OUtX3efWjMkakK}$$

This is similar to \eqref{eq:jTj3w8z7xMFMhkx2ujs} - except that we have $\det(\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}})$ in \eqref{eq:jTj3w8z7xMFMhkx2ujs} but $\det(\boldsymbol{A}_{{\color{green}i1},{\color{red}1j}})$ in \eqref{eq:QzQT6OUtX3efWjMkakK}. We now show that these two determinants are equal.

Recall that $\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}}$ represents the sub-matrix after the following row/column removal from $\boldsymbol{A}$:

- removing row $1$ and column $j$ first.
- removing row $i$ and column $1$ after.

The ordering by which we perform the above removal does not matter. This means that the above sub-matrix $\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}}$ is equal to the sub-matrix obtained by:

- removing row $i$ and column $1$ first.
- removing row $1$ and column $j$ after.

This sub-matrix is $\boldsymbol{A}_{{\color{green}i1},{\color{red}1j}}$. Therefore, we conclude that:

$$\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}}=\boldsymbol{A}_{{\color{green}i1},{\color{red}1j}}\label{eq:CiWC4e3N8IjRMlzO5Ij}$$
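The order-independence of the two removals is easy to confirm in code; a sketch (the helper `remove` and the sample $4\times4$ matrix are ours - note the index shifts after the first removal):

```python
def remove(A, i, j):
    """Return A with row i and column j removed (1-indexed)."""
    return [[x for c, x in enumerate(row, start=1) if c != j]
            for r, row in enumerate(A, start=1) if r != i]

A = [[11, 12, 13, 14],
     [21, 22, 23, 24],
     [31, 32, 33, 34],
     [41, 42, 43, 44]]

i, j = 3, 2
# Remove (row 1, col j), then (row i, col 1): after deleting row 1,
# original row i has shifted up to row i - 1; original column 1 is unmoved.
first = remove(remove(A, 1, j), i - 1, 1)
# Remove (row i, col 1), then (row 1, col j): after deleting column 1,
# original column j has shifted left to column j - 1; row 1 is unmoved.
second = remove(remove(A, i, 1), 1, j - 1)
print(first == second)  # True
```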

Let's now go over some examples to demonstrate \eqref{eq:CiWC4e3N8IjRMlzO5Ij}. The sub-matrix $\boldsymbol{A}_{{\color{green}12},{\color{red}21}}$ is equal to $\boldsymbol{A}_{{\color{green}21},{\color{red}12}}$ as shown below:

Here's an example for the $4\times4$ case:

Now taking the determinant of both matrices in \eqref{eq:CiWC4e3N8IjRMlzO5Ij} gives:

$$\det(\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}})=\det(\boldsymbol{A}_{{\color{green}i1},{\color{red}1j}})$$

Therefore, we can now equate \eqref{eq:jTj3w8z7xMFMhkx2ujs} and \eqref{eq:QzQT6OUtX3efWjMkakK} to get:

$$(-1)^{i+j+1}a_{1j}a_{i1}\det(\boldsymbol{A}_{{\color{green}1j},{\color{red}i1}})=(-1)^{i+j+1}a_{i1}a_{1j}\det(\boldsymbol{A}_{{\color{green}i1},{\color{red}1j}})\label{eq:SbUQL1xg9lqsFYgdMIE}$$

Remember, the left-hand side of \eqref{eq:SbUQL1xg9lqsFYgdMIE} appears in $C'_{\mathrm{row}=1}$ in \eqref{eq:TwW0mYFQngeOY8JOuu3} the same number of times that the right-hand side appears in $C'_{\mathrm{col}=1}$ in \eqref{eq:WvYWE0w8RWFqo9Y34fc}. Therefore, we conclude that:

$$C'_{\mathrm{row}=1}=C'_{\mathrm{col}=1}$$

It then follows that:

$$C_{\mathrm{row}=1}=C_{\mathrm{col}=1}=\det(\boldsymbol{A})$$

Again, we have managed to prove our proposition for the $3\times3$ case but the flow of the proof for the general case is very similar. This completes the proof.