Moment Generating Function
Before reading this guide, please make sure you're familiar with the concept of moments. If not, please check out the first part of our guide (TODO) first.
What is a moment-generating function?
We can describe a random variable $X$ using numerical measures such as its mean and standard deviation. However, these numerical measures do not uniquely capture the distribution of $X$ since many different distributions can still possess the same mean and standard deviation. One way of uniquely characterizing a distribution is by referring to its moment-generating function.
This makes moment-generating functions extremely useful when proving certain theorems such as the reproductive property of the normal distribution.
Moment-generating function
The moment-generating function $M_X(t)$ of a random variable $X$ is defined as:

$$M_X(t)=\mathbb{E}(e^{tX})$$
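Since the moment-generating function $M_X(t)=\mathbb{E}(e^{tX})$ is just an expectation, it is easy to evaluate numerically for a discrete random variable. Here is a minimal sketch, assuming a hypothetical fair six-sided die (not one of this guide's examples):

```python
import math

# Hypothetical example (not from this guide): a fair six-sided die,
# so X takes the values 1, ..., 6 each with probability 1/6.
pmf = {x: 1 / 6 for x in range(1, 7)}

def mgf(t, pmf):
    """M_X(t) = E[e^{tX}] = sum of e^{tx} * P(X = x) over all x."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

print(mgf(0.0, pmf))  # M_X(0) = E[e^0] = 1 for any random variable
```

Note that $M_X(0)=\mathbb{E}(e^{0})=1$ holds for every random variable, which makes a handy sanity check.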
Finding the moment-generating function of a discrete random variable
Find the moment-generating function of a discrete random variable $X$ with the following probability mass function:
Solution. Using the definition of moment-generating function:
Finding the moment-generating function of the standard normal distribution
Suppose a random variable $X$ follows the standard normal distribution:
Prove that the moment-generating function of $X$ is:

$$M_X(t)=e^{t^2/2}$$
Solution. Using the definition of moment-generating function:
Now, here's the 🤯 step - the function inside the integral is actually the probability density function of a normal distribution with mean $t$ and variance $1$. Since every probability density function integrates to one, the integral above equals one:
This completes the proof.
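As an informal numerical check of this result (a sketch, not part of the proof), we can approximate the integral $\mathbb{E}(e^{tX})=\int e^{tx}\varphi(x)\,dx$ with the trapezoidal rule and compare it against $e^{t^2/2}$; the test point $t=0.7$, the integration limits, and the step count are all arbitrary choices:

```python
import math

def std_normal_pdf(x):
    """Probability density function of the standard normal distribution."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def mgf_numeric(t, lo=-10.0, hi=10.0, n=20_000):
    """Trapezoidal approximation of E[e^{tX}] = integral of e^{tx} * pdf(x)."""
    h = (hi - lo) / n
    f = lambda x: math.exp(t * x) * std_normal_pdf(x)
    total = 0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n))
    return total * h

t = 0.7  # arbitrary test point
print(mgf_numeric(t), math.exp(t ** 2 / 2))  # the two values agree closely
```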
Derivative of moment-generating function
Let $X$ be any random variable. The $k$-th derivative of the moment-generating function of $X$, evaluated at $t=0$, gives the $k$-th moment of $X$ about the origin:

$$M_X^{(k)}(0)=\mathbb{E}(X^k)$$
If you don't know what the $k$-th moment of $X$ about the origin means, please consult the first section of our guide on moments. TODO
Proof. From the Taylor series expansion, we know that $e^{tx}$ can be represented as follows:
Let's now take the expected value of both sides:
Using the definition of expected value:
Now, let's define $\mu'_i$ like so:
Using \eqref{eq:v5cgn94QCe5jSpdeTpN}, we can write \eqref{eq:E8GLN2o41NCrdpqjVLd} as follows:
The magic ✨ happens when we start taking the derivatives of $M_X(t)$. Let's use the expression of $M_X(t)$ in \eqref{eq:E8GLN2o41NCrdpqjVLd}. The first derivative of $M_X(t)$ is:
If we set $t=0$, then we have:
We get the first moment of a random variable about the origin!
Now, let's take the second derivative:
If we set $t=0$:
We now get the second moment of a random variable about the origin!
Here, we can easily see where the moment-generating function gets its name: it can generate the $k$-th moment of a random variable about the origin! For the general case, we observe the following:
This completes the proof.
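The theorem is easy to illustrate numerically. Below is a sketch assuming a hypothetical Bernoulli random variable with $p=0.3$, whose moment-generating function is $(1-p)+pe^t$; approximating the derivatives at $t=0$ with central finite differences recovers $\mathbb{E}(X)=p$ and $\mathbb{E}(X^2)=p$:

```python
import math

# Hypothetical example: a Bernoulli random variable with p = 0.3,
# whose moment-generating function is M_X(t) = (1 - p) + p * e^t.
p = 0.3

def M(t):
    return (1 - p) + p * math.exp(t)

# Approximate the derivatives at t = 0 with central finite differences.
h = 1e-5
first_moment = (M(h) - M(-h)) / (2 * h)             # M'(0)  ~ E[X]
second_moment = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # M''(0) ~ E[X^2]

print(first_moment, second_moment)  # both close to p = 0.3
```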
Deriving mean and variance of standard normal random variables
Let $X$ be a random variable drawn from a standard normal distribution. Using the derivative of moment-generating function, compute the mean $\mathbb{E}(X)$ and variance $\mathbb{V}(X)$.
Solution. In the previous example, we derived the moment-generating function of a standard normal random variable $X$ to be:
Let's take the first derivative of the moment generating function:
From the theorem above, we know that setting $t=0$ will give us the first moment of $X$ about the origin:
Let's now take the second derivative:
Once again, from the theorem above, we know that setting $t=0$ in $M_X^{(2)}(t)$ will give us the second moment of $X$ about the origin:
Finally, recall that the variance can be computed from the first two moments:
Therefore, as we would expect, the mean and variance of a standard normal variable $X$ are:
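As an informal check (not a proof), we can draw samples from the standard normal distribution and compare the empirical mean and variance against the values derived above; the seed and sample size are arbitrary choices:

```python
import random
import statistics

# Informal sanity check: compare empirical moments of standard normal
# samples against the derived values E(X) = 0 and V(X) = 1.
random.seed(0)  # arbitrary seed for reproducibility
samples = [random.gauss(0, 1) for _ in range(200_000)]

mean = statistics.fmean(samples)
var = statistics.pvariance(samples)
print(mean, var)  # close to 0 and 1 respectively
```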
Properties of moment-generating functions
Moment-generating function of random variable X+a
Let $X$ be any random variable and $a$ be some scalar. The moment-generating function of $X+a$ can be expressed as:

$$M_{X+a}(t)=e^{at}\,M_X(t)$$
Proof. We can use the definition of moment generating functions to prove this:
This completes the proof.
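A quick numerical illustration, assuming a hypothetical fair coin flip taking values $0$ and $1$, with the arbitrary choices $a=2$ and $t=0.4$: computing the moment-generating function of $X+a$ directly agrees with $e^{at}M_X(t)$.

```python
import math

# Hypothetical example: X is a fair coin flip taking values 0 and 1;
# a = 2 and t = 0.4 are arbitrary choices.
pmf = {0: 0.5, 1: 0.5}
a, t = 2.0, 0.4

def mgf(t, pmf):
    return sum(math.exp(t * x) * p for x, p in pmf.items())

shifted = {x + a: p for x, p in pmf.items()}  # distribution of X + a
lhs = mgf(t, shifted)                # M_{X+a}(t) computed directly
rhs = math.exp(a * t) * mgf(t, pmf)  # e^{at} * M_X(t)
print(lhs, rhs)  # the two values agree
```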
Moment-generating function of random variable aX
Let $X$ be any random variable and $a$ be some scalar. The moment-generating function of $aX$ can be expressed as:

$$M_{aX}(t)=M_X(at)$$
Proof. We can use the definition of moment generating functions to prove this:
This completes the proof.
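This property can be illustrated numerically as well. Here is a sketch using a hypothetical discrete distribution, with $a=3$ and $t=0.25$ as arbitrary choices:

```python
import math

# Hypothetical discrete distribution; a = 3 and t = 0.25 are arbitrary.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}
a, t = 3.0, 0.25

def mgf(t, pmf):
    return sum(math.exp(t * x) * p for x, p in pmf.items())

scaled = {a * x: p for x, p in pmf.items()}  # distribution of aX
lhs = mgf(t, scaled)   # M_{aX}(t) computed directly
rhs = mgf(a * t, pmf)  # M_X(at)
print(lhs, rhs)  # the two values agree
```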
Moment-generating function of a sum of random variables
If $X_1$, $X_2$, $\cdots$, $X_n$ are independent random variables with moment-generating functions $M_{X_1}(t)$, $M_{X_2}(t)$, $\cdots$, $M_{X_n}(t)$ respectively, and $Y=X_1+X_2+\cdots+X_n$, then we have that:

$$M_Y(t)=M_{X_1}(t)\,M_{X_2}(t)\cdots M_{X_n}(t)$$
Proof. Once again, we use the definition of moment-generating functions:
Since $X_1$, $X_2$, $\cdots$, $X_n$ are independent:
This completes the proof.
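To illustrate this numerically (a sketch using two hypothetical independent variables, a fair coin and a fair die), we can build the probability mass function of the sum by convolution and check that its moment-generating function equals the product of the individual ones:

```python
import math
from collections import defaultdict
from itertools import product

# Two hypothetical independent discrete random variables.
pmf1 = {0: 0.5, 1: 0.5}                 # fair coin
pmf2 = {x: 1 / 6 for x in range(1, 7)}  # fair six-sided die

# Probability mass function of Y = X1 + X2, built by convolution.
pmf_sum = defaultdict(float)
for (x, px), (y, py) in product(pmf1.items(), pmf2.items()):
    pmf_sum[x + y] += px * py

def mgf(t, pmf):
    """M(t) = E[e^{tX}] for a discrete random variable."""
    return sum(math.exp(t * v) * p for v, p in pmf.items())

t = 0.3  # arbitrary test point
lhs = mgf(t, pmf_sum)              # M_Y(t) computed directly
rhs = mgf(t, pmf1) * mgf(t, pmf2)  # product of the individual MGFs
print(lhs, rhs)  # the two values agree
```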
Uniqueness Theorem
Let $X$ and $Y$ be two random variables with moment-generating functions $M_X(t)$ and $M_Y(t)$ respectively. If $M_X(t)=M_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution. In other words, if the moment-generating functions of two random variables are the same, then they must have the same probability distribution.
The proof of the uniqueness theorem is quite complex and requires graduate-level mathematics, so we will omit it here 😞. The uniqueness theorem is one of the most important properties of the moment-generating function and is used to prove theorems such as the reproductive property of the normal distribution (TODO)!