
# Comprehensive Guide on Column Space in Linear Algebra

Last updated: Mar 5, 2023

Tags: Linear Algebra
Definition.

# Column space

Let $\boldsymbol{A}$ be any $m\times{n}$ matrix:

\begin{align*} \boldsymbol{A}= \begin{pmatrix} \vert&\vert&\vert&\vert\\ \boldsymbol{a}_1&\boldsymbol{a}_2&\cdots&\boldsymbol{a}_n\\ \vert&\vert&\vert&\vert\\ \end{pmatrix} \end{align*}

The column space or range of an $m\times{n}$ matrix $\boldsymbol{A}$, denoted by $\mathrm{col}(\boldsymbol{A})$, is the span of its column vectors, that is:

\begin{align*} \mathrm{col}(\boldsymbol{A})&= \mathrm{span} (\boldsymbol{a}_1,\boldsymbol{a}_2,\cdots,\boldsymbol{a}_n) \end{align*}

Note that span is defined as the set of all the vectors that can be constructed using a linear combination of $\boldsymbol{a}_1$, $\boldsymbol{a}_2$, $\cdots$, $\boldsymbol{a}_n$, that is:

$$\mathrm{col}(\boldsymbol{A}) =\{ c_1\boldsymbol{a}_1+c_2\boldsymbol{a}_2+\cdots+c_n\boldsymbol{a}_n \;|\; c_1,c_2,\cdots,c_n\in\mathbb{R} \}$$
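To make the definition concrete, here is a short NumPy sketch (the matrix and coefficients are hypothetical) showing that a matrix-vector product $\boldsymbol{Ac}$ is precisely a linear combination of the columns of $\boldsymbol{A}$, and hence an element of $\mathrm{col}(\boldsymbol{A})$:

```python
import numpy as np

# A hypothetical 3x2 matrix: columns a1, a2 live in R^3.
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

# Any element of col(A) has the form c1*a1 + c2*a2 for scalars c1, c2.
c = np.array([2.0, -1.0])
v = c[0] * A[:, 0] + c[1] * A[:, 1]

# The matrix-vector product A @ c produces the same linear combination,
# so v is an element of col(A).
assert np.allclose(v, A @ c)
print(v)
```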
Example.

## Finding the column space of a matrix (1)

Consider the following matrix:

$$\boldsymbol{A}= \begin{pmatrix} 2&4\\ 5&3 \end{pmatrix}$$

Find the column space of $\boldsymbol{A}$.

Solution. By definition, the column space of a matrix is the span of its column vectors. The column space of $\boldsymbol{A}$ is:

$$\mathrm{col}(\boldsymbol{A})=\mathrm{span} \left( \begin{pmatrix}2\\5\end{pmatrix},\; \begin{pmatrix}4\\3\end{pmatrix} \right)$$

Notice that the two vectors are linearly independent. Since two linearly independent vectors in $\mathbb{R}^2$ span $\mathbb{R}^2$, the column space of $\boldsymbol{A}$ is the entire $\mathbb{R}^2$.
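The linear independence of the columns can be checked numerically: a $2\times2$ matrix with rank $2$ has linearly independent columns. A minimal sketch using NumPy:

```python
import numpy as np

# The matrix from the example above.
A = np.array([[2.0, 4.0],
              [5.0, 3.0]])

# Rank 2 means the two columns are linearly independent,
# so they span all of R^2 and col(A) = R^2.
rank = np.linalg.matrix_rank(A)
print(rank)  # 2
```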

Example.

## Finding the column space of a matrix (2)

Consider the following matrix:

$$\boldsymbol{A}= \begin{pmatrix} 2&3\\ 2&3 \end{pmatrix}$$

Find the column space of $\boldsymbol{A}$.

Solution. The column space of $\boldsymbol{A}$ is the span of its column vectors:

$$\mathrm{col}(\boldsymbol{A})= \mathrm{span} \left( \begin{pmatrix}2\\2\end{pmatrix},\; \begin{pmatrix}3\\3\end{pmatrix} \right)$$

By the plus/minus theorem, since the column vectors of $\boldsymbol{A}$ are linearly dependent, we can remove one vector and still preserve the same span:

$$\mathrm{col}(\boldsymbol{A})= \mathrm{span} \left( \begin{pmatrix}2\\2\end{pmatrix} \right)$$

This means that the column space of $\boldsymbol{A}$ is the line traced out by the first column vector. For instance, the following vector is a particular element of the column space of $\boldsymbol{A}$:

$$\begin{pmatrix} 5\\5 \end{pmatrix}\in \mathrm{col}(\boldsymbol{A})$$
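We can verify both claims numerically: the matrix has rank $1$ (its column space is a line), and $(5,5)$ lies on that line. A sketch using NumPy's least-squares solver:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [2.0, 3.0]])

# Rank 1: the columns are linearly dependent, so col(A) is a line.
assert np.linalg.matrix_rank(A) == 1

# (5, 5) lies on that line: it is 2.5 times the first column.
b = np.array([5.0, 5.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# Since b is in col(A), the least-squares solution satisfies Ax = b exactly.
assert np.allclose(A @ x, b)
```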
Theorem.

# The column space of a matrix is a subspace

The column space of an $m\times{n}$ matrix is a subspace of $\mathbb{R}^m$.

Proof. The column space of an $m\times{n}$ matrix $\boldsymbol{A}$ is defined as the span of its column vectors. Since each column vector of $\boldsymbol{A}$ has $m$ entries, every linear combination of the columns lies in $\mathbb{R}^m$, so the column space is a subset of $\mathbb{R}^m$. The column space also contains the zero vector because $\boldsymbol{A}\boldsymbol{0}=\boldsymbol{0}$. To show that the column space is a subspace, it remains to show that it is closed under addition and scalar multiplication.

Let $\boldsymbol{v}$ and $\boldsymbol{w}$ be any two elements in the column space of $\boldsymbol{A}$. This means that there exist some vectors $\boldsymbol{x}$ and $\boldsymbol{y}$ such that:

\begin{align*} \boldsymbol{A}\boldsymbol{x}&=\boldsymbol{v}\\ \boldsymbol{A}\boldsymbol{y}&=\boldsymbol{w}\\ \end{align*}

Now, let's check that the vector $\boldsymbol{v}+\boldsymbol{w}$ is contained in the column space of $\boldsymbol{A}$:

\begin{align*} \boldsymbol{v}+\boldsymbol{w} &=\boldsymbol{Ax}+\boldsymbol{Ay}\\ &=\boldsymbol{A}(\boldsymbol{x}+\boldsymbol{y})\\ \end{align*}

Since $\boldsymbol{A}(\boldsymbol{x}+\boldsymbol{y})$ results in $\boldsymbol{v}+\boldsymbol{w}$, we conclude that $\boldsymbol{v}+\boldsymbol{w}$ is contained in the column space of $\boldsymbol{A}$. This means that the column space of $\boldsymbol{A}$ is closed under addition.

Next, consider any scalar $k$. Since $\boldsymbol{v}$ is in the column space of $\boldsymbol{A}$, there exists some vector $\boldsymbol{x}$ such that $\boldsymbol{Ax}=\boldsymbol{v}$. Let's check whether $k\boldsymbol{v}$ is also contained in the column space of $\boldsymbol{A}$:

\begin{align*} k\boldsymbol{v} &=k(\boldsymbol{Ax})\\ &=\boldsymbol{A}(k\boldsymbol{x}) \end{align*}

Since $\boldsymbol{A}(k\boldsymbol{x})$ results in $k\boldsymbol{v}$, the vector $k\boldsymbol{v}$ is also contained in the column space of $\boldsymbol{A}$. This means that the column space of $\boldsymbol{A}$ is closed under scalar multiplication. Therefore, by definition, the column space of $\boldsymbol{A}$ is a subspace of $\mathbb{R}^m$. This completes the proof.
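The two closure properties in the proof can be spot-checked numerically. The sketch below uses a random (hypothetical) matrix; it is a numeric illustration of the identities $\boldsymbol{Ax}+\boldsymbol{Ay}=\boldsymbol{A}(\boldsymbol{x}+\boldsymbol{y})$ and $k(\boldsymbol{Ax})=\boldsymbol{A}(k\boldsymbol{x})$, not a substitute for the proof:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))

# v = Ax and w = Ay are elements of col(A).
x, y = rng.standard_normal(2), rng.standard_normal(2)
v, w = A @ x, A @ y

# Closure under addition: v + w = A(x + y), so v + w is in col(A).
assert np.allclose(v + w, A @ (x + y))

# Closure under scalar multiplication: k*v = A(k*x), so k*v is in col(A).
k = 3.7
assert np.allclose(k * v, A @ (k * x))
```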

Theorem.

# Relationship between consistent linear system and column space

A system of linear equations $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent if and only if $\boldsymbol{b}$ is contained in the column space of $\boldsymbol{A}$.

Proof. By definition, the column space of an $m\times{n}$ matrix $\boldsymbol{A}$ is the span of its column vectors, that is, the set of all linear combinations of the column vectors:

$$\mathrm{col}(\boldsymbol{A}) =\{ x_1\boldsymbol{a}_1+x_2\boldsymbol{a}_2+\cdots+x_n\boldsymbol{a}_n \;|\; x_1,x_2,\cdots,x_n\in\mathbb{R} \}$$

Also, by definition, if the system of linear equations $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent, then there exists at least one solution $\boldsymbol{x}$ such that the product $\boldsymbol{Ax}$ yields $\boldsymbol{b}$. Recall that $\boldsymbol{Ax}$ can always be written as a linear combination of the column vectors of $\boldsymbol{A}$, that is:

\begin{equation}\label{eq:ZIFT38Fn4iG3GO9QloM} \begin{aligned}[b] \boldsymbol{Ax}=\boldsymbol{b} \;\;\Longleftrightarrow\;\; \begin{pmatrix} \vert&\vert&\vert&\vert\\ \boldsymbol{a}_1&\boldsymbol{a}_2&\cdots&\boldsymbol{a}_n\\ \vert&\vert&\vert&\vert\\ \end{pmatrix} \begin{pmatrix} x_1\\x_2\\\vdots\\x_n \end{pmatrix}=\boldsymbol{b} \;\;\Longleftrightarrow\;\; x_1\boldsymbol{a}_1+x_2\boldsymbol{a}_2+\cdots+x_n\boldsymbol{a}_n=\boldsymbol{b} \end{aligned} \end{equation}

Again, given that $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent, we know that there exist scalars $x_1$, $x_2$, $\cdots$, $x_n$ that make the above equality hold. Since $\boldsymbol{b}$ can be expressed as a linear combination of the column vectors of $\boldsymbol{A}$, we have that $\boldsymbol{b}$ belongs to the column space of $\boldsymbol{A}$.

Let's now prove the converse: if $\boldsymbol{b}$ is contained in the column space of $\boldsymbol{A}$, then the system $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent. By definition, since $\boldsymbol{b}$ belongs to the column space of $\boldsymbol{A}$, we know that $\boldsymbol{b}$ can be expressed as a linear combination of the column vectors of $\boldsymbol{A}$, that is:

$$\begin{equation}\label{eq:C4O1JExiiANzJPJw14a} x_1\boldsymbol{a}_1+x_2\boldsymbol{a}_2+\cdots+x_n\boldsymbol{a}_n=\boldsymbol{b} \end{equation}$$

Using the same logic as \eqref{eq:ZIFT38Fn4iG3GO9QloM}, this can be converted into the linear system $\boldsymbol{Ax}=\boldsymbol{b}$. Because we know there exist $x_1$, $x_2$, $\cdots$, $x_n$ that make \eqref{eq:C4O1JExiiANzJPJw14a} hold, we conclude that $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent. This completes the proof.
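One practical consequence of this theorem: $\boldsymbol{b}\in\mathrm{col}(\boldsymbol{A})$ exactly when appending $\boldsymbol{b}$ as a column does not increase the rank, since a new column raises the rank only if it is linearly independent of the existing columns. A sketch using NumPy (the helper name `is_consistent` is our own):

```python
import numpy as np

def is_consistent(A, b):
    """Ax = b is consistent iff b is in col(A), which holds exactly
    when appending b as a column does not increase the rank."""
    Ab = np.column_stack([A, b])
    return np.linalg.matrix_rank(Ab) == np.linalg.matrix_rank(A)

# The matrix from the second example above: col(A) is the line through (1, 1).
A = np.array([[2.0, 3.0],
              [2.0, 3.0]])

print(is_consistent(A, np.array([5.0, 5.0])))  # True: (5, 5) is in col(A)
print(is_consistent(A, np.array([1.0, 2.0])))  # False: (1, 2) is not
```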

Theorem.

# Pivot columns of a matrix span the column space of the matrix

The columns in matrix $\boldsymbol{A}$ corresponding to the pivot columns of the reduced row echelon form of $\boldsymbol{A}$ span the column space of $\boldsymbol{A}$.

Proof. Suppose $\boldsymbol{A}$ is an $m\times{n}$ matrix. The column space of $\boldsymbol{A}$ is defined as the span of its column vectors:

$$\mathrm{col}(\boldsymbol{A})=\mathrm{span}(\boldsymbol{a}_1,\boldsymbol{a}_2 ,\cdots,\boldsymbol{a}_n )$$

We know from the plus/minus theorem that we can remove linearly dependent vectors from the spanning set while preserving the span. We also know that the columns of $\boldsymbol{A}$ corresponding to the non-pivot columns of $\mathrm{rref}(\boldsymbol{A})$ can be expressed as linear combinations of the columns of $\boldsymbol{A}$ corresponding to the pivot columns. In other words, the non-pivot columns are linearly dependent on the pivot columns.

Therefore, we can remove all columns in $\boldsymbol{A}$ corresponding to the non-pivot columns from the spanning set while preserving the span. This means that only the columns in $\boldsymbol{A}$ corresponding to the pivot columns are sufficient to span the column space of $\boldsymbol{A}$.

This completes the proof.
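SymPy can compute the reduced row echelon form along with the pivot column indices, which identifies a spanning set as in the theorem above. A sketch with a hypothetical matrix whose third column is the sum of the first two:

```python
import sympy as sp

# A hypothetical matrix: the third column is the sum of the first two,
# so it is a non-pivot column.
A = sp.Matrix([[1, 0, 1],
               [2, 1, 3],
               [0, 3, 3]])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()
print(pivots)  # (0, 1): the first two columns are pivot columns

# The corresponding columns of A (not of R!) span col(A).
spanning_set = [A[:, j] for j in pivots]
```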

Theorem.

# Pivot columns of a matrix form a basis for the column space of the matrix

The columns in matrix $\boldsymbol{A}$ corresponding to the pivot columns of the reduced row echelon form of $\boldsymbol{A}$ form a basis for the column space of $\boldsymbol{A}$.

Proof. From the previous theorem, we know that the columns of $\boldsymbol{A}$ corresponding to the pivot columns of $\mathrm{rref}(\boldsymbol{A})$ span the column space of $\boldsymbol{A}$. We also know that these columns are linearly independent. Therefore, by the definition of a basis, the columns of $\boldsymbol{A}$ corresponding to the pivot columns of $\mathrm{rref}(\boldsymbol{A})$ form a basis for the column space of $\boldsymbol{A}$. This completes the proof.
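SymPy packages this result directly: `Matrix.columnspace()` returns a basis for the column space, consisting of the columns of $\boldsymbol{A}$ at the pivot positions of $\mathrm{rref}(\boldsymbol{A})$. A sketch with a hypothetical rank-2 matrix:

```python
import sympy as sp

# A hypothetical 3x3 matrix of rank 2 (third column = first + second).
A = sp.Matrix([[1, 0, 1],
               [2, 1, 3],
               [0, 3, 3]])

# columnspace() returns a basis for col(A): the columns of A
# corresponding to the pivot columns of rref(A).
basis = A.columnspace()
print(len(basis))  # 2 basis vectors, since rank(A) = 2
```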
