
# Comprehensive Guide on Orthogonal Complement

Aug 12, 2023


# Orthogonal complement

If $W$ is a subspace of $\mathbb{R}^n$, then the set of vectors in $\mathbb{R}^n$ that are orthogonal to every vector in $W$ is called the orthogonal complement, often denoted as $W^\perp$.
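To make the definition concrete, here is a minimal NumPy sketch (not part of the original guide; the helper name `orthogonal_complement` and the example matrix `B` are illustrative choices). It computes an orthonormal basis for $W^\perp$ by finding all directions orthogonal to every basis vector of $W$:

```python
import numpy as np

def orthogonal_complement(B, tol=1e-10):
    """Return an orthonormal basis (as columns) for the orthogonal
    complement of col(B).

    A vector x lies in col(B)^perp exactly when B.T @ x = 0, so we read
    the complement off the right-singular vectors of B.T whose singular
    values are (numerically) zero.
    """
    _, s, Vt = np.linalg.svd(B.T)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

# W = span{(1, 0, 1), (0, 1, 1)}, a plane in R^3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
C = orthogonal_complement(B)   # one column: the normal direction of the plane
print(C.T @ B)                 # ~0: orthogonal to every basis vector of W
```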

## Orthogonal complement of a line in $\mathbb{R}^2$

We know from an example in our guide on subspaces that the set of vectors on a line $W$ passing through the origin is a subspace of $\mathbb{R}^2$. Its orthogonal complement $W^\perp$ is the line through the origin perpendicular to $W$: every vector in $W^\perp$ is perpendicular to every vector in $W$.
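As a quick numerical check of this picture (the specific vectors below are illustrative choices), take $W$ to be the line spanned by $(2,1)$; its orthogonal complement is the line spanned by $(-1,2)$:

```python
import numpy as np

d = np.array([2.0, 1.0])     # direction of the line W
p = np.array([-1.0, 2.0])    # direction of the perpendicular line W_perp

# Every vector in W is a*d and every vector in W_perp is b*p, so their
# dot product is a*b*(d . p) = 0 for all scalars a and b.
for a in (-2.0, 0.5, 3.0):
    for b in (-1.0, 4.0):
        assert np.isclose(np.dot(a * d, b * p), 0.0)
print("every vector in W_perp is perpendicular to every vector in W")
```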

# Orthogonal complement is a subspace

If $W$ is a subspace of $\mathbb{R}^n$, then the orthogonal complement $W^\perp$ is a subspace of $\mathbb{R}^n$ as well.

Proof. To prove that $W^\perp$ is also a subspace, we must show that $W^\perp$ is closed under addition and scalar multiplication. Suppose vectors $\boldsymbol{w}_1$ and $\boldsymbol{w}_2$ are in $W^\perp$. By definition of orthogonal complements, any vector $\boldsymbol{v}$ in $W$ is perpendicular to $\boldsymbol{w}_1$ and $\boldsymbol{w}_2$. This means that:

$$\boldsymbol{v}\cdot\boldsymbol{w}_1=0,\qquad\boldsymbol{v}\cdot\boldsymbol{w}_2=0$$

Adding them gives:

$$\boldsymbol{v}\cdot\boldsymbol{w}_1+\boldsymbol{v}\cdot\boldsymbol{w}_2=\boldsymbol{v}\cdot(\boldsymbol{w}_1+\boldsymbol{w}_2)=0$$

This means that the vector $\boldsymbol{w}_1+\boldsymbol{w}_2$ is also orthogonal to $\boldsymbol{v}$. In other words, $\boldsymbol{w}_1+\boldsymbol{w}_2$ also resides in $W^\perp$, which means that $W^\perp$ is closed under addition.

Next, let's check whether $W^\perp$ is closed under scalar multiplication. Let $\boldsymbol{w}$ be a vector in $W^\perp$ and let $k$ be some scalar. Again, any vector $\boldsymbol{v}$ in $W$ must be perpendicular to $\boldsymbol{w}$ by definition of orthogonal complements. Therefore, we have that:

$$\boldsymbol{v}\cdot\boldsymbol{w}=0$$

Multiplying both sides by $k$ gives:

$$k(\boldsymbol{v}\cdot\boldsymbol{w})=\boldsymbol{v}\cdot(k\boldsymbol{w})=0$$

This means that $k\boldsymbol{w}$ must be orthogonal to $\boldsymbol{v}$, and so $k\boldsymbol{w}$ must be in $W^\perp$. Therefore, $W^\perp$ is closed under scalar multiplication.

Because $W^\perp$ is closed under addition and scalar multiplication, we have that $W^\perp$ is a subspace of $\mathbb{R}^n$. This completes the proof.
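The two closure properties are easy to observe numerically. Below is a small sketch (the choice of a one-dimensional $W$ in $\mathbb{R}^4$ is our own) that constructs two vectors orthogonal to $W$ and checks that their sum and scalar multiples remain orthogonal to $W$:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(4)        # v spans a 1-dimensional subspace W of R^4

# Build two members of W_perp by removing the component along v.
w1 = rng.standard_normal(4)
w1 -= (v @ w1) / (v @ v) * v
w2 = rng.standard_normal(4)
w2 -= (v @ w2) / (v @ v) * v

print(np.isclose(v @ (w1 + w2), 0.0))   # True: closed under addition
print(np.isclose(v @ (3.7 * w1), 0.0))  # True: closed under scalar multiplication
```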

# Intersection of a subspace and its orthogonal complement is the zero vector

If $W$ is a subspace of $\mathbb{R}^n$ and $W^\perp$ is its orthogonal complement, then the only vector contained in both $W$ and $W^\perp$ is the zero vector $\boldsymbol{0}$. This is often written as $W\cap{W^\perp}=\{\boldsymbol{0}\}$.

Proof. Let $\boldsymbol{v}$ be a vector contained in both $W$ and $W^\perp$. By definition, this means that $\boldsymbol{v}$ must be orthogonal to itself, that is:

$$\boldsymbol{v}\cdot\boldsymbol{v}=0$$

Since $\boldsymbol{v}\cdot\boldsymbol{v}=\Vert\boldsymbol{v}\Vert^2$, the equation above becomes:

$$\Vert\boldsymbol{v}\Vert^2=0$$

The only way for this to be true is if $\boldsymbol{v}$ is the zero vector. This completes the proof.
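Here is a small worked check (the subspace is an arbitrary illustrative choice): a vector lying in both $\mathrm{span}\{(1,2)\}$ and its orthogonal complement $\mathrm{span}\{(-2,1)\}$ must satisfy $a(1,2)=b(-2,1)$, and the only solution is $a=b=0$:

```python
import numpy as np

u = np.array([1.0, 2.0])        # spans W
u_perp = np.array([-2.0, 1.0])  # spans W_perp (note u . u_perp = 0)

# A common vector satisfies a*u - b*u_perp = 0, i.e. M @ [a, b] = 0.
M = np.column_stack([u, -u_perp])
print(np.linalg.det(M))         # nonzero, so the only solution is a = b = 0,
                                # which corresponds to the zero vector
```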

# Orthogonality of null space and row space (1)

If $\boldsymbol{A}$ is an $m\times{n}$ matrix, then every vector in the null space of $\boldsymbol{A}$ is orthogonal to every vector in the column space of $\boldsymbol{A}^T$, that is:

$$\mathrm{nullspace}(\boldsymbol{A})\perp\mathrm{col}(\boldsymbol{A}^T)$$

Proof. Let $\boldsymbol{v}$ be a vector in the null space of $\boldsymbol{A}$. This means that:

$$\boldsymbol{A}\boldsymbol{v}=\boldsymbol{0}$$

Let $\boldsymbol{w}$ be a vector in the column space of $\boldsymbol{A}^T$. Let the column vectors of $\boldsymbol{A}^T$ be:

$$\boldsymbol{a}_1,\;\boldsymbol{a}_2,\;\cdots,\;\boldsymbol{a}_m$$

Because $\boldsymbol{w}$ is in the column space of $\boldsymbol{A}^T$, we know that $\boldsymbol{w}$ can be expressed as a linear combination of the column vectors of $\boldsymbol{A}^T$, that is:

$$\boldsymbol{w}=c_1\boldsymbol{a}_1+c_2\boldsymbol{a}_2+\cdots+c_m\boldsymbol{a}_m$$

where $c_1$, $c_2$, $\cdots$, $c_m$ are real numbers. Writing the linear combination as a matrix-vector product, this becomes:

$$\boldsymbol{w}=\boldsymbol{A}^T\boldsymbol{c}$$

where vector $\boldsymbol{c}$ holds $c_1$, $c_2$, $\cdots$, $c_m$.

Now, let's take the dot product of $\boldsymbol{v}$ and $\boldsymbol{w}$ to get:

$$\boldsymbol{v}\cdot\boldsymbol{w}=\boldsymbol{v}^T\boldsymbol{w}=\boldsymbol{v}^T(\boldsymbol{A}^T\boldsymbol{c})=(\boldsymbol{A}\boldsymbol{v})^T\boldsymbol{c}=\boldsymbol{0}^T\boldsymbol{c}=0$$

Note the following:

- for the 1st step, we expressed the dot product as a matrix product.
- for the 2nd step, we used $\boldsymbol{w}=\boldsymbol{A}^T\boldsymbol{c}$.
- for the 3rd step, we used the transpose property $(\boldsymbol{AB})^T=\boldsymbol{B}^T\boldsymbol{A}^T$.
- for the 4th step, we used $\boldsymbol{A}\boldsymbol{v}=\boldsymbol{0}$.

Because the dot product of $\boldsymbol{v}$ and $\boldsymbol{w}$ is zero, we know that $\boldsymbol{v}$ and $\boldsymbol{w}$ are perpendicular. This completes the proof.
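The proof translates directly into a numerical experiment. The sketch below (with an arbitrary random matrix of our choosing) builds $\boldsymbol{w}=\boldsymbol{A}^T\boldsymbol{c}$ and a null-space vector $\boldsymbol{v}$, then confirms that their dot product vanishes:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))            # an arbitrary 3x5 matrix

# Basis for nullspace(A): right-singular vectors with ~0 singular value.
_, s, Vt = np.linalg.svd(A)
N = Vt[int(np.sum(s > 1e-10)):].T          # 5 x nullity(A)

v = N @ rng.standard_normal(N.shape[1])    # v in nullspace(A), so A @ v ~ 0
w = A.T @ rng.standard_normal(3)           # w = A^T c, a vector in col(A^T)

print(np.dot(v, w))                        # ~0, exactly as the proof shows
```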

# Orthogonality of null space and row space (2)

If $\boldsymbol{A}$ is an $m\times{n}$ matrix, then every vector in the null space of $\boldsymbol{A}^T$ is orthogonal to every vector in the column space of $\boldsymbol{A}$, that is:

$$\mathrm{nullspace}(\boldsymbol{A}^T)\perp\mathrm{col}(\boldsymbol{A})$$

Proof. The flow of the proof is very similar to that of the previous theorem. Let $\boldsymbol{v}$ be a vector in the null space of $\boldsymbol{A}^T$. By definition of the null space, we have that:

$$\boldsymbol{A}^T\boldsymbol{v}=\boldsymbol{0}$$

Let $\boldsymbol{w}$ be a vector in the column space of $\boldsymbol{A}$. This means that $\boldsymbol{w}$ can be obtained like so:

$$\boldsymbol{w}=\boldsymbol{A}\boldsymbol{c}$$

where $\boldsymbol{c}$ is some vector of real numbers. Now, we take the dot product of $\boldsymbol{v}$ and $\boldsymbol{w}$ to get:

$$\boldsymbol{v}\cdot\boldsymbol{w}=\boldsymbol{v}^T\boldsymbol{w}=\boldsymbol{v}^T(\boldsymbol{A}\boldsymbol{c})=(\boldsymbol{A}^T\boldsymbol{v})^T\boldsymbol{c}=\boldsymbol{0}^T\boldsymbol{c}=0$$

Because the dot product of $\boldsymbol{v}$ and $\boldsymbol{w}$ is zero, we have that $\boldsymbol{v}$ and $\boldsymbol{w}$ are perpendicular. This completes the proof.
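An analogous numerical check for this second theorem (again with an arbitrary random matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))            # an arbitrary 5x3 matrix

# Basis for nullspace(A^T): right-singular vectors of A^T with ~0 singular value.
_, s, Vt = np.linalg.svd(A.T)
N = Vt[int(np.sum(s > 1e-10)):].T          # 5 x nullity(A^T)

v = N @ rng.standard_normal(N.shape[1])    # v in nullspace(A^T)
w = A @ rng.standard_normal(3)             # w = A c, a vector in col(A)

print(np.dot(v, w))                        # ~0
```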

# Sum of the dimension of a subspace and its complement

If $W$ is a subspace of $\mathbb{R}^n$ and $W^\perp$ is the orthogonal complement of $W$, then:

$$\mathrm{dim}(W)+\mathrm{dim}(W^\perp)=n$$

Proof. Let $W$ be a subspace of $\mathbb{R}^n$ with basis $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_k\}$. By definition, the dimension of a vector space is equal to the number of basis vectors of the vector space. Because there are $k$ basis vectors, the dimension of $W$ is $k$, that is:

$$\mathrm{dim}(W)=k$$

Let's now find the dimension of the orthogonal complement $W^\perp$. Let $\boldsymbol{A}$ be a matrix whose columns are the basis vectors $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_k$. Since these basis vectors are in $\mathbb{R}^n$, the shape of $\boldsymbol{A}$ is $n\times{k}$.

The column space of $\boldsymbol{A}$ is defined as the span of its column vectors, which in this case form a basis for $W$. Therefore, we have that:

$$\mathrm{col}(\boldsymbol{A})=W$$

Now, consider $\boldsymbol{A}^T$, which has the shape $k\times{n}$. We know from the rank-nullity theorem that:

$$\mathrm{rank}(\boldsymbol{A}^T)+\mathrm{nullity}(\boldsymbol{A}^T)=n$$

Because the rank of a matrix equals the rank of its transpose, $\mathrm{rank}(\boldsymbol{A}^T)=\mathrm{rank}(\boldsymbol{A})$. Therefore, the equation above becomes:

$$\mathrm{rank}(\boldsymbol{A})+\mathrm{nullity}(\boldsymbol{A}^T)=n$$

By definition, the rank of $\boldsymbol{A}$ is equal to the dimension of the column space of $\boldsymbol{A}$, that is:

$$\mathrm{rank}(\boldsymbol{A})=\mathrm{dim}\big(\mathrm{col}(\boldsymbol{A})\big)$$

Substituting $\mathrm{col}(\boldsymbol{A})=W$ gives:

$$\mathrm{rank}(\boldsymbol{A})=\mathrm{dim}(W)$$

Next, the nullity of $\boldsymbol{A}^T$ is defined as the dimension of the null space of $\boldsymbol{A}^T$, that is:

$$\mathrm{nullity}(\boldsymbol{A}^T)=\mathrm{dim}\big(\mathrm{nullspace}(\boldsymbol{A}^T)\big)$$

Since the null space of $\boldsymbol{A}^T$ is the orthogonal complement of the column space of $\boldsymbol{A}$, that is, $\mathrm{nullspace}(\boldsymbol{A}^T)=\big(\mathrm{col}(\boldsymbol{A})\big)^\perp$, the equation above becomes:

$$\mathrm{nullity}(\boldsymbol{A}^T)=\mathrm{dim}\Big(\big(\mathrm{col}(\boldsymbol{A})\big)^\perp\Big)$$

Substituting $\mathrm{col}(\boldsymbol{A})=W$ once more gives:

$$\mathrm{nullity}(\boldsymbol{A}^T)=\mathrm{dim}(W^\perp)$$

Finally, substituting $\mathrm{rank}(\boldsymbol{A})=\mathrm{dim}(W)$ and $\mathrm{nullity}(\boldsymbol{A}^T)=\mathrm{dim}(W^\perp)$ into the rank-nullity equation gives:

$$\mathrm{dim}(W)+\mathrm{dim}(W^\perp)=n$$

This completes the proof.
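The dimension count is easy to verify numerically. The sketch below (the sizes $n=7$, $k=3$ are arbitrary choices) computes $\mathrm{dim}(W)$ as the rank of $\boldsymbol{A}$ and $\mathrm{dim}(W^\perp)$ as the nullity of $\boldsymbol{A}^T$, mirroring the proof:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 7, 3
A = rng.standard_normal((n, k))    # columns form a basis for W (full rank almost surely)

dim_W = np.linalg.matrix_rank(A)                   # rank(A) = dim(W)
_, s, _ = np.linalg.svd(A.T)                       # A^T is k x n
dim_W_perp = n - int(np.sum(s > 1e-10))            # nullity(A^T) = dim(W_perp)

print(dim_W, dim_W_perp, dim_W + dim_W_perp == n)  # 3 4 True
```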

# Direct sum

Let $W$ and $W'$ be subspaces of a vector space $V$ whose intersection contains only the zero vector, that is, $W\cap{W'}=\{\boldsymbol{0}\}$. The direct sum of $W$ and $W'$, denoted as $W\oplus{W'}$, is defined as:

$$W\oplus{W'}=\{\boldsymbol{w}+\boldsymbol{w}'\;:\;\boldsymbol{w}\in{W},\;\boldsymbol{w}'\in{W'}\}$$
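For example, in $\mathbb{R}^2$, if $W$ is the $x$-axis and $W'$ is the $y$-axis, then $W\cap{W'}=\{\boldsymbol{0}\}$ and $W\oplus{W'}=\mathbb{R}^2$, since any vector can be decomposed as $(a,b)=(a,0)+(0,b)$.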

# Expressing a vector using a subspace and its orthogonal complement

If $W$ is a finite-dimensional subspace of the vector space $V$, then:

$$V=W\oplus{W^\perp}$$

This means that any vector in $V$ can be expressed as the sum of some vector in $W$ and some vector in $W^\perp$.

Proof. Let $W$ be a subspace of $\mathbb{R}^n$ and let $W^\perp$ be its orthogonal complement. By the previous theorem, we know the following:

$$\mathrm{dim}(W)+\mathrm{dim}(W^\perp)=n$$

Suppose we have the following:

- let $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_k\}$ be a basis for the subspace $W$.
- let $\{\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_{n-k}\}$ be a basis for the subspace $W^\perp$.

Let $\boldsymbol{v}\in{W}$ and $\boldsymbol{w}\in{W^\perp}$. By definition, the basis vectors span their respective vector space, so we can express $\boldsymbol{v}$ and $\boldsymbol{w}$ as linear combinations of the basis vectors:

$$\boldsymbol{v}=c_1\boldsymbol{v}_1+c_2\boldsymbol{v}_2+\cdots+c_k\boldsymbol{v}_k,\qquad\boldsymbol{w}=d_1\boldsymbol{w}_1+d_2\boldsymbol{w}_2+\cdots+d_{n-k}\boldsymbol{w}_{n-k}$$

The vector $\boldsymbol{v}+\boldsymbol{w}$ is:

$$\boldsymbol{v}+\boldsymbol{w}=c_1\boldsymbol{v}_1+c_2\boldsymbol{v}_2+\cdots+c_k\boldsymbol{v}_k+d_1\boldsymbol{w}_1+d_2\boldsymbol{w}_2+\cdots+d_{n-k}\boldsymbol{w}_{n-k}$$

Our goal now is to show that $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_k,\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_{n-k}\}$ is a basis for $\mathbb{R}^n$. Let's start by checking that this is a linearly independent set using the definition of linear independence:

$$c_1\boldsymbol{v}_1+c_2\boldsymbol{v}_2+\cdots+c_k\boldsymbol{v}_k+d_1\boldsymbol{w}_1+d_2\boldsymbol{w}_2+\cdots+d_{n-k}\boldsymbol{w}_{n-k}=\boldsymbol{0}$$

Let's move all the vectors in $W^\perp$ to the right-hand side:

$$c_1\boldsymbol{v}_1+c_2\boldsymbol{v}_2+\cdots+c_k\boldsymbol{v}_k=-d_1\boldsymbol{w}_1-d_2\boldsymbol{w}_2-\cdots-d_{n-k}\boldsymbol{w}_{n-k}$$

Here:

- the left-hand side is a linear combination of the basis vectors of $W$, and hence a vector in $W$.
- the right-hand side is a linear combination of the basis vectors of $W^\perp$, and hence a vector in $W^\perp$.

We know from an earlier theorem that the only vector that resides in both $W$ and $W^\perp$ is the zero vector. Therefore, the left-hand side and the right-hand side of the equation above must both equal the zero vector, that is:

$$c_1\boldsymbol{v}_1+c_2\boldsymbol{v}_2+\cdots+c_k\boldsymbol{v}_k=\boldsymbol{0},\qquad-d_1\boldsymbol{w}_1-d_2\boldsymbol{w}_2-\cdots-d_{n-k}\boldsymbol{w}_{n-k}=\boldsymbol{0}$$

Multiplying both sides of the second equation by $-1$ gives:

$$d_1\boldsymbol{w}_1+d_2\boldsymbol{w}_2+\cdots+d_{n-k}\boldsymbol{w}_{n-k}=\boldsymbol{0}$$

Now, $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_k\}$ is a basis for $W$, which means that the vectors in this set are linearly independent. By the definition of linear independence, $c_1$, $c_2$, $\cdots$, $c_k$ must be zero. Similarly, $\{\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_{n-k}\}$ is a basis for $W^\perp$, which again means that the vectors in this set are linearly independent. Therefore, $d_1$, $d_2$, $\cdots$, $d_{n-k}$ must also be zero.

This means that the only way for the earlier equality to hold is if all the coefficients are zero. The set $S=\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_k,\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_{n-k}\}$ is therefore a linearly independent set. Since $S$ consists of $n$ linearly independent vectors in $\mathbb{R}^n$, $S$ is a basis for $\mathbb{R}^n$.

Because $S$ is a basis for $\mathbb{R}^n$, any vector in $\mathbb{R}^n$ can be expressed as some linear combination of the vectors in $S$. Remember that the vectors in $S$ are simply the basis vectors of $W$ and $W^\perp$. Therefore, any vector $\boldsymbol{x}$ in $\mathbb{R}^n$ can be written as a sum of a vector in $W$ and a vector in $W^\perp$ like so:

$$\boldsymbol{x}=\underbrace{c_1\boldsymbol{v}_1+\cdots+c_k\boldsymbol{v}_k}_{\text{vector in }W}+\underbrace{d_1\boldsymbol{w}_1+\cdots+d_{n-k}\boldsymbol{w}_{n-k}}_{\text{vector in }W^\perp}$$

This completes the proof.
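In practice, the decomposition $\boldsymbol{x}=\boldsymbol{v}+\boldsymbol{w}$ can be computed with an orthogonal projection. The sketch below (the basis matrix `B` for $W$ is an arbitrary illustrative choice) finds the component of $\boldsymbol{x}$ inside $W$ by least squares; the leftover is then the component in $W^\perp$:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((5, 2))           # columns form a basis for W in R^5
x = rng.standard_normal(5)                # an arbitrary vector in R^5

# Least squares finds the coefficients of x's component inside W; the
# residual x - v is then orthogonal to every column of B, i.e. in W_perp.
coeffs, *_ = np.linalg.lstsq(B, x, rcond=None)
v = B @ coeffs                            # component in W
w = x - v                                 # component in W_perp

print(np.allclose(x, v + w))              # True: x = v + w
print(np.abs(B.T @ w).max())              # ~0: w is orthogonal to W
```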

# Unique vector representation using subspace and its orthogonal complement

If $V=W\oplus{W^\perp}$, then every vector in $V$ has a unique representation as the sum of a vector in $W$ and a vector in $W^\perp$.

Proof. Let $\boldsymbol{x}\in{V}$, let $\boldsymbol{v}_1,\boldsymbol{v}_2\in{W}$ and let $\boldsymbol{w}_1,\boldsymbol{w}_2\in{W^\perp}$. Suppose that the representation is not unique, that is, there exist two distinct representations of $\boldsymbol{x}$ like so:

$$\boldsymbol{x}=\boldsymbol{v}_1+\boldsymbol{w}_1,\qquad\boldsymbol{x}=\boldsymbol{v}_2+\boldsymbol{w}_2$$

Equating the two equations and rearranging gives:

$$\boldsymbol{v}_1-\boldsymbol{v}_2=\boldsymbol{w}_2-\boldsymbol{w}_1$$

Since $W$ is a subspace and thus closed under addition and scalar multiplication, we have that $\boldsymbol{v}_1-\boldsymbol{v}_2\in{W}$. Similarly, $\boldsymbol{w}_2-\boldsymbol{w}_1\in{W^\perp}$.

By an earlier theorem, we know that the only vector that resides in both $W$ and $W^\perp$ is the zero vector. For the equality above to hold, the left-hand side and the right-hand side must both equal the zero vector:

$$\boldsymbol{v}_1-\boldsymbol{v}_2=\boldsymbol{0},\qquad\boldsymbol{w}_2-\boldsymbol{w}_1=\boldsymbol{0}$$

Therefore $\boldsymbol{v}_1=\boldsymbol{v}_2$ and $\boldsymbol{w}_1=\boldsymbol{w}_2$, so the two representations must be the same. This completes the proof.

# Orthogonal complement of the orthogonal complement

If $W$ is a subspace of $\mathbb{R}^n$ and $W^\perp$ is its orthogonal complement, then the orthogonal complement of $W^\perp$ is $W$. Mathematically, this translates to:

$$(W^\perp)^\perp=W$$

Proof. Let $\boldsymbol{x}\in(W^\perp)^\perp$. By the earlier theorem, any vector in $\mathbb{R}^n$ can be expressed as the sum of a vector in $W$ and a vector in $W^\perp$. Suppose $\boldsymbol{v}\in{W}$ and $\boldsymbol{w}\in{W^\perp}$ such that:

$$\boldsymbol{x}=\boldsymbol{v}+\boldsymbol{w}$$

Let's take the dot product with $\boldsymbol{w}$ on both sides to get:

$$\boldsymbol{x}\cdot\boldsymbol{w}=\boldsymbol{v}\cdot\boldsymbol{w}+\boldsymbol{w}\cdot\boldsymbol{w}$$

We know that $\boldsymbol{x}\cdot\boldsymbol{w}=0$ because $\boldsymbol{x}\in(W^\perp)^\perp$ and $\boldsymbol{w}\in{W^\perp}$. Therefore, we get:

$$0=\boldsymbol{v}\cdot\boldsymbol{w}+\boldsymbol{w}\cdot\boldsymbol{w}$$

We know that $\boldsymbol{v}\cdot\boldsymbol{w}=0$ because $\boldsymbol{v}\in{W}$ and $\boldsymbol{w}\in{W^\perp}$. Next, since $\boldsymbol{w}\cdot\boldsymbol{w}=\Vert\boldsymbol{w}\Vert^2$, we end up with:

$$0=\Vert\boldsymbol{w}\Vert^2$$

This implies that $\boldsymbol{w}$ must be equal to the zero vector. We now go back to $\boldsymbol{x}=\boldsymbol{v}+\boldsymbol{w}$ to get:

$$\boldsymbol{x}=\boldsymbol{v}+\boldsymbol{0}=\boldsymbol{v}$$

Since $\boldsymbol{v}\in{W}$, we have $\boldsymbol{x}\in{W}$, which shows that $(W^\perp)^\perp\subseteq{W}$. Conversely, every vector in $W$ is orthogonal to every vector in $W^\perp$ by definition, so $W\subseteq(W^\perp)^\perp$. We conclude that $(W^\perp)^\perp=W$. This completes the proof.
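Finally, a numerical sanity check of $(W^\perp)^\perp=W$ (the helper `complement` and the random basis are illustrative choices): taking the orthogonal complement twice should recover the original column space:

```python
import numpy as np

def complement(B, tol=1e-10):
    """Orthonormal basis (as columns) for the orthogonal complement of col(B)."""
    _, s, Vt = np.linalg.svd(B.T)
    return Vt[int(np.sum(s > tol)):].T

rng = np.random.default_rng(4)
B = rng.standard_normal((6, 2))   # columns span W
C = complement(B)                 # columns span W_perp (6 x 4)
D = complement(C)                 # columns span (W_perp)_perp (6 x 2)

# col(D) = col(B): stacking the two bases does not increase the rank,
# and both spaces have the same dimension.
print(np.linalg.matrix_rank(np.hstack([B, D])) == np.linalg.matrix_rank(B))  # True
```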