# Comprehensive Guide on Transpose of Matrices

Last updated: Mar 5, 2023
Tags: Linear Algebra
Definition.

# Transpose of a matrix

The transpose of a matrix $\boldsymbol{A}$, denoted as $\boldsymbol{A}^T$, is a matrix obtained by swapping the rows and the columns of $\boldsymbol{A}$. This means that:

• the first row of $\boldsymbol{A}$ becomes the first column of $\boldsymbol{A}^T$.

• the second row of $\boldsymbol{A}$ becomes the second column of $\boldsymbol{A}^T$.

• and so on.

Mathematically, if $\boldsymbol{A}$ is an $m\times{n}$ matrix with entries $a_{ij}$, then its transpose $\boldsymbol{A}^T$ is an $n\times{m}$ matrix with entries $a_{ji}$. In other words, the entry located at the $i$-th row and $j$-th column of $\boldsymbol{A}$ will be located at the $j$-th row and $i$-th column of $\boldsymbol{A}^T$.

Example.

# Finding the transpose of matrices

Find the transpose of the following matrices:

$$\boldsymbol{A}= \begin{pmatrix} 1&2\\3&4 \end{pmatrix},\;\;\;\;\; \boldsymbol{B}= \begin{pmatrix} 1&2&3\\4&5&6 \end{pmatrix},\;\;\;\;\; \boldsymbol{C}= \begin{pmatrix} 1&2&3\\4&5&6\\7&8&9 \end{pmatrix}$$

Solution. The transposes of these matrices are:

$$\boldsymbol{A}^T= \begin{pmatrix} 1&3\\2&4 \end{pmatrix},\;\;\;\;\; \boldsymbol{B}^T= \begin{pmatrix} 1&4\\2&5\\3&6 \end{pmatrix},\;\;\;\;\; \boldsymbol{C}^T= \begin{pmatrix} 1&4&7\\2&5&8\\3&6&9 \end{pmatrix}$$

Notice how taking the transpose affects the shape of the matrices:

• a $2\times2$ matrix remains a $2\times2$ matrix.

• a $2\times3$ matrix becomes a $3\times2$ matrix.

• a $3\times3$ matrix remains a $3\times3$ matrix.

In general, if $\boldsymbol{A}$ is an $m\times{n}$ matrix, then $\boldsymbol{A}^T$ will be an $n\times{m}$ matrix. Let's also understand the mathematical definition of the matrix transpose. Observe what happens to the entry $a_{21}=3$ in $\boldsymbol{A}$ after taking the transpose. Since the rows and columns are swapped, this entry is located at $a_{12}$ in $\boldsymbol{A}^T$.
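We can check these transposes numerically. Below is a minimal sketch using NumPy (assuming it is installed), where the `.T` attribute returns the transpose of an array:

```python
import numpy as np

# The three matrices from the example above
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[1, 2, 3],
              [4, 5, 6]])
C = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# .T swaps rows and columns
print(A.T)        # entries: [[1, 3], [2, 4]]
print(B.T.shape)  # a 2x3 matrix becomes 3x2
print(C.T)
```

Note how the entry $a_{21}=3$ of $\boldsymbol{A}$ shows up at position $(1,2)$ of `A.T`, matching the definition.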

Example.

# Shape of a matrix after taking its transpose

Suppose we take the transpose of a matrix $\boldsymbol{A}$ with $2$ rows and $3$ columns. What would the shape of $\boldsymbol{A}^T$ be?

Solution. Since we are swapping the rows and columns, the transpose of $\boldsymbol{A}$ would have $3$ rows and $2$ columns, that is, $\boldsymbol{A}^T\in\mathbb{R}^{3\times2}$.

# Properties of matrix transpose

Theorem.

## Transpose of a matrix transpose

Taking the transpose of a matrix transpose results in the matrix itself:

$$(\boldsymbol{A}^T)^T=\boldsymbol{A}$$

Proof. By definition, $\boldsymbol{A}^T$ is obtained by swapping the rows and columns of $\boldsymbol{A}$. Taking the transpose of $\boldsymbol{A}^T$ will swap back the rows and columns and so we end up with the original matrix $\boldsymbol{A}$.

To be more mathematically precise, suppose $\boldsymbol{A}$ is as follows:

$$\boldsymbol{A}=\begin{pmatrix} a_{11}&a_{12}&\cdots&a_{1n}\\ a_{21}&a_{22}&\cdots&a_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{m1}&a_{m2}&\cdots&a_{mn} \end{pmatrix}$$

The transpose of $\boldsymbol{A}$ is:

$$\boldsymbol{A}^T=\begin{pmatrix} a_{11}&a_{21}&\cdots&a_{m1}\\ a_{12}&a_{22}&\cdots&a_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ a_{1n}&a_{2n}&\cdots&a_{mn} \end{pmatrix}$$

The transpose of $\boldsymbol{A}^T$ is:

$$(\boldsymbol{A}^T)^T=\begin{pmatrix} a_{11}&a_{12}&\cdots&a_{1n}\\ a_{21}&a_{22}&\cdots&a_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{m1}&a_{m2}&\cdots&a_{mn} \end{pmatrix}= \boldsymbol{A}$$

This completes the proof.
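A quick numerical sanity check of this property, sketched with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 10, size=(3, 4))  # an arbitrary 3x4 matrix

# Transposing twice recovers the original matrix
double_transpose = A.T.T
assert (double_transpose == A).all()
```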

Theorem.

## Transpose of A+B

If $\boldsymbol{A}$ and $\boldsymbol{B}$ are matrices, then:

$$(\boldsymbol{A}+\boldsymbol{B})^T= \boldsymbol{A}^T+\boldsymbol{B}^T$$

Proof. Consider the following matrices:

$$\boldsymbol{A}=\begin{pmatrix} a_{11}&a_{12}&\cdots&a_{1n}\\ a_{21}&a_{22}&\cdots&a_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{m1}&a_{m2}&\cdots&a_{mn} \end{pmatrix},\;\;\;\;\;\; \boldsymbol{B}=\begin{pmatrix} b_{11}&b_{12}&\cdots&b_{1n}\\ b_{21}&b_{22}&\cdots&b_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ b_{m1}&b_{m2}&\cdots&b_{mn} \end{pmatrix}$$

The transpose of $\boldsymbol{A}$ and $\boldsymbol{B}$ is:

$$\boldsymbol{A}^T=\begin{pmatrix} a_{11}&a_{21}&\cdots&a_{m1}\\ a_{12}&a_{22}&\cdots&a_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ a_{1n}&a_{2n}&\cdots&a_{mn} \end{pmatrix},\;\;\;\;\;\; \boldsymbol{B}^T=\begin{pmatrix} b_{11}&b_{21}&\cdots&b_{m1}\\ b_{12}&b_{22}&\cdots&b_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ b_{1n}&b_{2n}&\cdots&b_{mn} \end{pmatrix}$$

The sum $\boldsymbol{A}^T+\boldsymbol{B}^T$ is:

$$\label{eq:yJ6VsJMu5sStPhtgG31} \boldsymbol{A}^T+\boldsymbol{B}^T=\begin{pmatrix} a_{11}+b_{11}&a_{21}+b_{21}&\cdots&a_{m1}+b_{m1}\\ a_{12}+b_{12}&a_{22}+b_{22}&\cdots&a_{m2}+b_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ a_{1n}+b_{1n}&a_{2n}+b_{2n}&\cdots&a_{mn}+b_{mn} \end{pmatrix}$$

Now, the sum $\boldsymbol{A}+\boldsymbol{B}$ is:

$$\boldsymbol{A}+\boldsymbol{B}=\begin{pmatrix} a_{11}+b_{11}&a_{12}+b_{12}&\cdots&a_{1n}+b_{1n}\\ a_{21}+b_{21}&a_{22}+b_{22}&\cdots&a_{2n}+b_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{m1}+b_{m1}&a_{m2}+b_{m2}&\cdots&a_{mn}+b_{mn} \end{pmatrix}$$

Taking the transpose gives:

$$\label{eq:CA2ZWVSz7LnrIMu6mZs} (\boldsymbol{A}+\boldsymbol{B})^T=\begin{pmatrix} a_{11}+b_{11}&a_{21}+b_{21}&\cdots&a_{m1}+b_{m1}\\ a_{12}+b_{12}&a_{22}+b_{22}&\cdots&a_{m2}+b_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ a_{1n}+b_{1n}&a_{2n}+b_{2n}&\cdots&a_{mn}+b_{mn} \end{pmatrix}$$

Notice that this matrix is equal to \eqref{eq:yJ6VsJMu5sStPhtgG31}. Therefore, we have that:

$$(\boldsymbol{A}+\boldsymbol{B})^T= \boldsymbol{A}^T+\boldsymbol{B}^T$$

This completes the proof.
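The additivity property can likewise be verified with NumPy (a sketch; the random matrices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))  # must match A's shape for the sum

# (A + B)^T equals A^T + B^T entry by entry
assert np.allclose((A + B).T, A.T + B.T)
```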

Theorem.

## Diagonal entries of a square matrix do not change after taking its transpose

Let $\boldsymbol{A}$ be a square matrix. The diagonal entries of $\boldsymbol{A}$ and $\boldsymbol{A}^T$ are the same.

Proof. Suppose $\boldsymbol{A}$ is an $n\times{n}$ matrix. The diagonal entries of $\boldsymbol{A}$ are $a_{ii}$ for $i=1,2,\cdots,n$. Taking the transpose swaps the two subscripts, but since they are identical, the diagonal entries of $\boldsymbol{A}^T$ are also $a_{ii}$. This completes the proof.
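A short NumPy illustration, using `np.diag` to extract the diagonal entries:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# The diagonal entries a_ii are unchanged by transposition
d = np.diag(A)      # [1, 5, 9]
d_T = np.diag(A.T)  # also [1, 5, 9]
assert (d == d_T).all()
```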

Theorem.

## Transpose of kA where k is a scalar

If $\boldsymbol{A}$ is a matrix and $k$ is any scalar, then:

$$(k\boldsymbol{A})^T=k\boldsymbol{A}^T$$

Proof. Consider the following matrix:

$$\boldsymbol{A}=\begin{pmatrix} a_{11}&a_{12}&\cdots&a_{1n}\\ a_{21}&a_{22}&\cdots&a_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{m1}&a_{m2}&\cdots&a_{mn} \end{pmatrix}$$

If $k$ is any scalar, then the product $k\boldsymbol{A}$ is:

$$k\boldsymbol{A}=\begin{pmatrix} ka_{11}&ka_{12}&\cdots&ka_{1n}\\ ka_{21}&ka_{22}&\cdots&ka_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ ka_{m1}&ka_{m2}&\cdots&ka_{mn} \end{pmatrix}$$

The transpose of $k\boldsymbol{A}$ is:

$$\label{eq:VMQuvGftyd4iwd5MPJ0} (k\boldsymbol{A})^T=\begin{pmatrix} ka_{11}&ka_{21}&\cdots&ka_{m1}\\ ka_{12}&ka_{22}&\cdots&ka_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ ka_{1n}&ka_{2n}&\cdots&ka_{mn} \end{pmatrix}$$

The transpose of $\boldsymbol{A}$ is:

$$\boldsymbol{A}^T=\begin{pmatrix} a_{11}&a_{21}&\cdots&a_{m1}\\ a_{12}&a_{22}&\cdots&a_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ a_{1n}&a_{2n}&\cdots&a_{mn} \end{pmatrix}$$

The scalar-matrix product $k\boldsymbol{A}^T$ is:

$$\label{eq:mEq1tU7quIWC01rlPIr} k\boldsymbol{A}^T=\begin{pmatrix} ka_{11}&ka_{21}&\cdots&ka_{m1}\\ ka_{12}&ka_{22}&\cdots&ka_{m2}\\ \vdots&\vdots&\ddots&\vdots\\ ka_{1n}&ka_{2n}&\cdots&ka_{mn} \end{pmatrix}$$

This is equal to the matrix in \eqref{eq:VMQuvGftyd4iwd5MPJ0}. Therefore, we conclude that:

$$(k\boldsymbol{A})^T=k\boldsymbol{A}^T$$

This completes the proof.
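A brief NumPy check of this scalar property (the scalar `k` chosen here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
k = 2.5  # any scalar

# (kA)^T equals k(A^T)
assert np.allclose((k * A).T, k * A.T)
```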

Theorem.

## Expressing dot product using matrix notation

If $\boldsymbol{v}$ and $\boldsymbol{w}$ are vectors, then:

$$\boldsymbol{v}\cdot{\boldsymbol{w}} \;\;\;{\color{green}=}\;\;\;\boldsymbol{v}^T\boldsymbol{w} \;\;\;{\color{green}=}\;\;\;\boldsymbol{w}^T\boldsymbol{v}$$

Proof. We will prove the case for $\mathbb{R}^3$, but the argument easily generalizes to $\mathbb{R}^n$. Let vectors $\boldsymbol{v}$ and $\boldsymbol{w}$ be defined as follows:

$$\boldsymbol{v}= \begin{pmatrix} v_1\\ v_2\\ v_3\\ \end{pmatrix},\;\;\;\;\; \boldsymbol{w}= \begin{pmatrix} w_1\\ w_2\\ w_3\\ \end{pmatrix}$$

Their dot product is:

\begin{align*} \boldsymbol{v}\cdot{\boldsymbol{w}}&= v_1w_1+v_2w_2+v_3w_3\\ &= \begin{pmatrix} v_1&v_2&v_3 \end{pmatrix} \begin{pmatrix} w_1\\w_2\\w_3 \end{pmatrix}\\ &=\boldsymbol{v}^T\boldsymbol{w} \end{align*}

Similarly, we have that:

\begin{align*} \boldsymbol{v}\cdot{\boldsymbol{w}}&= v_1w_1+v_2w_2+v_3w_3\\ &= \begin{pmatrix} w_1&w_2&w_3 \end{pmatrix} \begin{pmatrix} v_1\\v_2\\v_3 \end{pmatrix}\\ &=\boldsymbol{w}^T\boldsymbol{v} \end{align*}

Note that the product $\boldsymbol{v}\boldsymbol{w}^T$, in contrast, is the $3\times3$ outer product rather than a scalar, so it is not equal to the dot product.

This completes the proof.
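The equalities above can be mirrored in NumPy using explicit column vectors (a sketch; the `(3, 1)` shapes keep the row/column distinction visible):

```python
import numpy as np

# Column vectors, shaped (3, 1) to mirror the matrix notation
v = np.array([[1], [2], [3]])
w = np.array([[4], [5], [6]])

vT_w = (v.T @ w).item()  # 1x1 matrix, extracted as a scalar
wT_v = (w.T @ v).item()
assert vT_w == wT_v == 32  # 1*4 + 2*5 + 3*6

# By contrast, v @ w.T is the 3x3 outer product, not the dot product
assert (v @ w.T).shape == (3, 3)
```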

Theorem.

## Matrix product of a vector and its transpose is equal to the vector's squared magnitude

If $\boldsymbol{v}$ is a vector in $\mathbb{R}^n$, then:

$$\boldsymbol{v}^T\boldsymbol{v}= \Vert\boldsymbol{v}\Vert^2$$

Where $\Vert\boldsymbol{v}\Vert$ is the magnitude of $\boldsymbol{v}$.

Proof. By the previous theorem and the basic property of the dot product $\boldsymbol{v}\cdot\boldsymbol{v}=\Vert\boldsymbol{v}\Vert^2$, we have that:

\begin{align*} \boldsymbol{v}^T\boldsymbol{v}&= \boldsymbol{v}\cdot{\boldsymbol{v}}\\ &=\Vert\boldsymbol{v}\Vert^2 \end{align*}

This completes the proof.
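Sketched in NumPy, with `np.linalg.norm` computing the magnitude:

```python
import numpy as np

v = np.array([3.0, 4.0])

# v^T v equals the squared magnitude ||v||^2
assert np.isclose(v @ v, np.linalg.norm(v) ** 2)  # both equal 25.0
```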

Theorem.

## Transpose of a product of two matrices

If $\boldsymbol{A}$ is an $m\times{n}$ matrix and $\boldsymbol{B}$ is an $n\times{p}$ matrix, then:

$$(\boldsymbol{A}\boldsymbol{B})^T= \boldsymbol{B}^T\boldsymbol{A}^T$$

Proof. From the rules of matrix multiplication, we know that:

$$\label{eq:hZKaAMSlx63jVMkEVY8} (\boldsymbol{AB})_{ij}= \sum^n_{k=1}a_{ik}\cdot{b_{kj}}$$

Here, the subscript $ij$ represents the value in the $i$-th row $j$-th column. \eqref{eq:hZKaAMSlx63jVMkEVY8} is true for all $i=1,2,\cdots,m$ and $j=1,2,\cdots,p$. For instance, $(\boldsymbol{AB})_{13}$ represents the entry at the $1$st row $3$rd column of matrix $\boldsymbol{AB}$.

We know from the definition of transpose that:

$$\label{eq:aZZgDF30ZHVz6WidSNK} (\boldsymbol{AB})_{ij}= (\boldsymbol{AB})^T_{ji}$$

Equating \eqref{eq:aZZgDF30ZHVz6WidSNK} and \eqref{eq:hZKaAMSlx63jVMkEVY8} gives:

$$\label{eq:LMNSG63uKXgNU6nkuDe} \sum^n_{k=1}a_{ik}\cdot{b_{kj}}= (\boldsymbol{AB})^T_{ji}$$

Now, consider the following:

$$\label{eq:ZgdDoK2zrOpYmDYO8yN} \begin{aligned}[b] (\boldsymbol{B}^T\boldsymbol{A}^T)_{ji}&= \sum^n_{k=1}b^T_{jk}\cdot{a^T_{ki}}\\ &=\sum^n_{k=1}b_{kj}\cdot{a_{ik}} \end{aligned}$$

Here, we used the fact that the entry $b_{jk}$ in $\boldsymbol{B}^T$ is equal to the entry $b_{kj}$ in $\boldsymbol{B}$.

Equating \eqref{eq:LMNSG63uKXgNU6nkuDe} and \eqref{eq:ZgdDoK2zrOpYmDYO8yN} gives:

$$\label{eq:Bd1Iy1tDn1Ilz2TpbMa} (\boldsymbol{B}^T\boldsymbol{A}^T)_{ji}= (\boldsymbol{AB})^T_{ji}$$

Equation \eqref{eq:Bd1Iy1tDn1Ilz2TpbMa} holds true for all $j=1,2,\cdots,p$ and $i=1,2,\cdots,m$. Since the corresponding entries of $(\boldsymbol{AB})^T$ and $\boldsymbol{B}^T\boldsymbol{A}^T$ are all equal, the two matrices are equal, that is:

$$(\boldsymbol{AB})^T= \boldsymbol{B}^T\boldsymbol{A}^T$$

This completes the proof.
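A NumPy check of the reversal rule (the matrix shapes below are arbitrary but conformable):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))  # m x n
B = rng.standard_normal((3, 4))  # n x p

# (AB)^T equals B^T A^T; note the reversed order
assert np.allclose((A @ B).T, B.T @ A.T)
```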

Theorem.

## Transpose of a product of three matrices

If $\boldsymbol{A}$, $\boldsymbol{B}$ and $\boldsymbol{C}$ are matrices, then:

$$(\boldsymbol{ABC})^T= \boldsymbol{C}^T\boldsymbol{B}^T\boldsymbol{A}^T$$

Proof. The proof makes use of our previous theorem $(\boldsymbol{AB})^T=\boldsymbol{B}^T\boldsymbol{A}^T$. We start from the left-hand side of our proposition:

\begin{align*} (\boldsymbol{ABC})^T &=\Big((\boldsymbol{AB})\boldsymbol{C}\Big)^T\\ &=\boldsymbol{C}^T(\boldsymbol{AB})^T\\ &=\boldsymbol{C}^T\boldsymbol{B}^T\boldsymbol{A}^T \end{align*}

This completes the proof.
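The three-matrix case, checked the same way:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))

# (ABC)^T equals C^T B^T A^T: the order fully reverses
assert np.allclose((A @ B @ C).T, C.T @ B.T @ A.T)
```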

We will now generalize this theorem for any number of matrices. The proof uses induction and follows a similar logic.

Theorem.

## Transpose of a product of n matrices

If $\boldsymbol{A}_i$ for $i=1,2,\cdots,n$ are matrices, then:

$$(\boldsymbol{A}_1 \boldsymbol{A}_2 \cdots \boldsymbol{A}_n)^T= \boldsymbol{A}_n^T \cdots \boldsymbol{A}_2^T \boldsymbol{A}_1^T$$

Proof. We will prove the theorem by induction. Consider the base case of $2$ matrices. We have already shown in the theorem above that:

$$(\boldsymbol{A}_1\boldsymbol{A}_2)^T= \boldsymbol{A}_2^T\boldsymbol{A}_1^T$$

Therefore, the base case holds. We now assume that the theorem holds for $n-1$ matrices:

$$\label{eq:IHQoYjMe8rq0BGQ1T2f} (\boldsymbol{A}_1 \boldsymbol{A}_2\cdots \boldsymbol{A}_{n-1} )^T= \boldsymbol{A}_{n-1}^T\cdots\boldsymbol{A}_2^T\boldsymbol{A}_1^T$$

Our goal is to show that the theorem holds for $n$ matrices:

\begin{align*} (\boldsymbol{A}_1 \boldsymbol{A}_2\cdots \boldsymbol{A}_{n-1} \boldsymbol{A}_{n} )^T&= \Big((\boldsymbol{A}_1 \boldsymbol{A}_2\cdots \boldsymbol{A}_{n-1}) \boldsymbol{A}_{n}\Big)^T\\ &=\boldsymbol{A}_{n}^T(\boldsymbol{A}_1 \boldsymbol{A}_2\cdots \boldsymbol{A}_{n-1})^T \end{align*}

We now use the inductive assumption \eqref{eq:IHQoYjMe8rq0BGQ1T2f} to get:

$$(\boldsymbol{A}_1 \boldsymbol{A}_2\cdots \boldsymbol{A}_{n-1} \boldsymbol{A}_{n} )^T= \boldsymbol{A}_{n}^T \boldsymbol{A}_{n-1}^T\cdots\boldsymbol{A}_2^T \boldsymbol{A}_1^T$$

By the principle of mathematical induction, the theorem holds for the general case of $n$ matrices. This completes the proof.
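The general case can be sketched with `functools.reduce` to chain the products (the five matrices here are arbitrary):

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(0)
mats = [rng.standard_normal((3, 3)) for _ in range(5)]  # five square matrices

lhs = reduce(np.matmul, mats).T                         # (A1 A2 ... An)^T
rhs = reduce(np.matmul, [M.T for M in reversed(mats)])  # An^T ... A2^T A1^T
assert np.allclose(lhs, rhs)
```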

The next theorem is useful for certain proofs.

Theorem.

## Another expression for the dot product of Av and w

If $\boldsymbol{A}$ is an $m\times{n}$ matrix and $\boldsymbol{v}\in\mathbb{R}^n$ and $\boldsymbol{w}\in\mathbb{R}^{m}$ are vectors, then:

$$\boldsymbol{A}\boldsymbol{v}\cdot\boldsymbol{w}= \boldsymbol{v}\cdot\boldsymbol{A}^T\boldsymbol{w}$$

Proof. The matrix-vector product $\boldsymbol{A}\boldsymbol{v}$ results in a vector. We can use the earlier theorem expressing the dot product in matrix notation to convert the dot product of $\boldsymbol{Av}$ and $\boldsymbol{w}$ into a matrix product:

\begin{align*} \boldsymbol{A}\boldsymbol{v}\cdot\boldsymbol{w}&= \boldsymbol{w}^T(\boldsymbol{A}\boldsymbol{v})\\ &=(\boldsymbol{w}^T\boldsymbol{A})\boldsymbol{v}\\ &=(\boldsymbol{A}^T\boldsymbol{w})^T\boldsymbol{v}\\ &=\boldsymbol{v}\cdot\boldsymbol{A}^T\boldsymbol{w} \\ \end{align*}

Note the following:

• the third step uses the earlier theorem $(\boldsymbol{A}\boldsymbol{B})^T= \boldsymbol{B}^T\boldsymbol{A}^T$ along with $(\boldsymbol{A}^T)^T=\boldsymbol{A}$.

• the final step uses the earlier theorem to convert the matrix product back into a dot product.

This completes the proof.
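A final NumPy check of this identity (shapes chosen to match the theorem's $m\times{n}$ setup):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # m x n
v = rng.standard_normal(3)       # v in R^n
w = rng.standard_normal(4)       # w in R^m

# Av . w equals v . (A^T w)
assert np.isclose((A @ v) @ w, v @ (A.T @ w))
```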
