Moment Generating Function

Last updated: Aug 12, 2023

Before reading this guide, please make sure you're familiar with the concept of moments. If not, please check out the first part of our guide (TODO) here first.

What is a moment-generating function?

We can describe a random variable $X$ using numerical measures such as its mean and standard deviation. However, these numerical measures do not uniquely capture the distribution of $X$ since many different distributions can still possess the same mean and standard deviation. One way of uniquely characterizing a distribution is by referring to its moment-generating function.

This makes moment-generating functions extremely useful when proving certain theorems such as the reproductive property of the normal distribution.

Definition.

Moment-generating function

The moment-generating function $M_X(t)$ of a random variable $X$ is defined as:

$$M_X(t)=\mathbb{E}(e^{tx})$$
Example.

Finding the moment-generating function of a discrete random variable

Find the moment-generating function of a discrete random variable $X$ with the following probability mass function:

$$p_X(x)= \begin{cases} 0.2\text{ if } x=3\\ 0.8\text{ if } x=4 \end{cases}$$

Solution. Using the definition of moment-generating function:

$$\begin{align*} M_X(t)&=\mathbb{E}(e^{tx})\\ &=\sum^2_{i=1}e^{tx_i}\cdot{p_X(x_i)}\\ &=(e^{3t})(0.2)+(e^{4t})(0.8)\\ &=0.2e^{3t}+0.8e^{4t} \end{align*}$$
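As an optional sanity check (not part of the original solution), we can sample from this probability mass function with NumPy and compare the Monte Carlo estimate of $\mathbb{E}(e^{tX})$ against the closed form $0.2e^{3t}+0.8e^{4t}$. The sample size and values of $t$ below are arbitrary choices for illustration:

```python
import numpy as np

# Sample X from its pmf and compare the empirical E[e^{tX}]
# against the closed-form MGF 0.2*e^{3t} + 0.8*e^{4t}.
rng = np.random.default_rng(seed=0)
samples = rng.choice([3, 4], size=200_000, p=[0.2, 0.8])

for t in [0.0, 0.1, 0.5]:
    mc_estimate = np.exp(t * samples).mean()                    # empirical E[e^{tX}]
    closed_form = 0.2 * np.exp(3 * t) + 0.8 * np.exp(4 * t)     # derived MGF
    print(f"t={t}: Monte Carlo={mc_estimate:.4f}, closed form={closed_form:.4f}")
```

The two columns should agree up to Monte Carlo error, which shrinks as the sample size grows.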
Example.

Finding the moment-generating function of the standard normal distribution

Suppose a random variable $X$ follows the standard normal distribution:

$$f_X(x)=\frac{1}{\sqrt{2\pi}} \exp\left(\frac{-x^2}{2}\right)$$

Prove that the moment-generating function of $X$ is:

$$M_X(t)= \exp\Big(\frac{t^2}{2}\Big)$$

Solution. Using the definition of moment-generating function:

$$\begin{align*} M_X(t)&=\mathbb{E}(e^{tx})\\ &=\int^\infty_{-\infty} \frac{1}{\sqrt{2\pi}}\exp(tx)\cdot\exp\left(\frac{-x^2}{2}\right)\;dx\\ &=\int^\infty_{-\infty} \frac{1}{\sqrt{2\pi}}\exp\left(\frac{-x^2}{2}+tx\right)\;dx\\ &=\int^\infty_{-\infty} \frac{1}{\sqrt{2\pi}}\exp\left(\frac{-x^2+2tx}{2}\right)\;dx\\ &=\int^\infty_{-\infty} \frac{1}{\sqrt{2\pi}}\exp\left(\frac{-(x^2-2tx+t^2)+t^2}{2}\right)\;dx\\ &=\int^\infty_{-\infty} \frac{1}{\sqrt{2\pi}}\exp\left(\frac{-(x-t)^2}{2}+\frac{t^2}{2}\right)\;dx\\ &=\exp\Big(\frac{t^2}{2}\Big) \int^\infty_{-\infty} \frac{1}{\sqrt{2\pi}} \exp\left(\frac{-(x-t)^2}{2}\right)\;dx\\ \end{align*}$$

Now, here's the 🤯 step - the function inside the integral is the probability density function of a normal distribution with mean $t$ and variance $1$. Since the area under any probability density function must equal one, the integral above evaluates to one:

$$M_X(t)= \exp\Big(\frac{t^2}{2}\Big)$$

This completes the proof.
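If you'd like to double-check this result numerically, one option (a sketch for illustration only, not part of the proof) is to evaluate $\int e^{tx}f_X(x)\,dx$ with SciPy's quad and compare it against $\exp(t^2/2)$; the particular values of $t$ below are arbitrary:

```python
import numpy as np
from scipy.integrate import quad

# Standard normal density f_X(x)
def standard_normal_pdf(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Numerically evaluate E[e^{tX}] and compare with exp(t^2 / 2)
for t in [0.0, 0.5, 1.0, 2.0]:
    integral, _ = quad(lambda x: np.exp(t * x) * standard_normal_pdf(x), -np.inf, np.inf)
    print(f"t={t}: integral={integral:.6f}, exp(t^2/2)={np.exp(t**2 / 2):.6f}")
```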

Theorem.

Derivative of moment-generating function

Let $X$ be any random variable. The $k$-th derivative of the moment-generating function of $X$ evaluated at $t=0$ gives the $k$-th moment of $X$ about the origin:

$$M_X^{(k)}(0)=\mathbb{E}(x^k)=\mu'_k$$

If you don't know what the $k$-th moment of $X$ about the origin means, please consult the first section of our guide on moments. TODO

Proof. From the Taylor series of the exponential function, we know that $e^{tx}$ can be represented as follows:

$$e^{tx}= \frac{(tx)^0}{0!} +\frac{(tx)^1}{1!} +\frac{(tx)^2}{2!} +\frac{(tx)^3}{3!} +\cdots +\frac{(tx)^n}{n!}$$

Let's now take expected value of both sides:

$$\begin{equation}\label{eq:E8GLN2o41NCrdpqjVLd} \begin{aligned}[b] M_X(t)&= \mathbb{E}(e^{tx})\\&= \mathbb{E}\Big(\frac{(tx)^0}{0!}+\frac{(tx)^1}{1!}+\frac{(tx)^2}{2!} +\cdots +\frac{(tx)^n}{n!}\Big)\\ &= \mathbb{E}\Big(\frac{(tx)^0}{0!}\Big) +\mathbb{E}\Big(\frac{(tx)^1}{1!}\Big) +\mathbb{E}\Big(\frac{(tx)^2}{2!}\Big) +\cdots +\mathbb{E}\Big(\frac{(tx)^n}{n!}\Big)\\ &=\frac{t^0}{0!}\mathbb{E}(x^0) +\frac{t^1}{1!}\mathbb{E}(x^1) +\frac{t^2}{2!}\mathbb{E}(x^2) +\cdots +\frac{t^n}{n!}\mathbb{E}(x^n)\\ \end{aligned} \end{equation}$$

Using the definition of expected value (here we assume $X$ is discrete; the continuous case is analogous, with integrals in place of sums):

$$\begin{equation}\label{eq:ruURQ7S1eURQjBz05Jy} \begin{aligned}[b] M_X(t)&= \frac{t^0}{0!}\mathbb{E}(x^0) +\frac{t^1}{1!}\mathbb{E}(x^1) +\frac{t^2}{2!}\mathbb{E}(x^2) +\cdots +\frac{t^n}{n!}\mathbb{E}(x^n)\\ &=\frac{t^0}{0!}\sum_x(x^0p(x)) +\frac{t^1}{1!}\sum_x(x^1p(x)) +\frac{t^2}{2!}\sum_x(x^2p(x)) +\cdots +\frac{t^n}{n!}\sum_x(x^np(x))\\ \end{aligned} \end{equation}$$

Now, let's define $\mu'_i$ like so:

$$\begin{equation}\label{eq:v5cgn94QCe5jSpdeTpN} \mu'_i= \mathbb{E}(x^i) =\sum_x{x^ip(x)} \end{equation}$$

Using \eqref{eq:v5cgn94QCe5jSpdeTpN}, we can write \eqref{eq:E8GLN2o41NCrdpqjVLd} as follows:

$$\begin{align*} M_X(t)&= \frac{t^0}{0!}\mu'_0 +\frac{t^1}{1!}\mu'_1 +\frac{t^2}{2!}\mu'_2 +\cdots +\frac{t^n}{n!}\mu'_n\\ &= \sum_{i=0}^n\frac{t^i}{i!}\mu'_i \end{align*}$$

The magic ✨ happens when we start taking the derivatives of $M_X(t)$. Let's use the expression of $M_X(t)$ in \eqref{eq:E8GLN2o41NCrdpqjVLd}. The first derivative of $M_X(t)$ is:

$$\begin{align*} M_X^{(1)}(t) &=\frac{d}{dt}\Big( \frac{t^0}{0!}\mathbb{E}(x^0) +\frac{t^1}{1!}\mathbb{E}(x^1) +\frac{t^2}{2!}\mathbb{E}(x^2) +\cdots +\frac{t^n}{n!}\mathbb{E}(x^n)\Big)\\ &= \frac{t^0}{0!}\mathbb{E}(x^1) +\frac{t^1}{1!}\mathbb{E}(x^2) +\frac{t^2}{2!}\mathbb{E}(x^3) +\cdots +\frac{t^{n-1}}{(n-1)!}\mathbb{E}(x^n) \end{align*}$$

If we set $t=0$, then we have:

$$M^{(1)}_X(0)=\mathbb{E}(x^1)=\mu'_1$$

We get the first moment of a random variable about the origin!

Now, let's take the second derivative:

$$\begin{align*} M_X^{(2)}(t)&= \frac{d}{dt}M_X^{(1)}(t)\\ &=\frac{d}{dt}\Big(\frac{t^0}{0!}\mathbb{E}(x^1) +\frac{t^1}{1!}\mathbb{E}(x^2) +\frac{t^2}{2!}\mathbb{E}(x^3) +\cdots +\frac{t^{n-1}}{(n-1)!}\mathbb{E}(x^n)\Big)\\ &=\frac{t^0}{0!}\mathbb{E}(x^2) +\frac{t^1}{1!}\mathbb{E}(x^3) +\cdots +\frac{t^{n-2}}{(n-2)!}\mathbb{E}(x^n) \end{align*}$$

If we set $t=0$:

$$M_X^{(2)}(0)=\mathbb{E}(x^2)=\mu'_2$$

We now get the second moment of a random variable about the origin!

Here, we can easily see where the moment-generating function gets its name: it can generate the $k$-th moment of a random variable about the origin! For the general case, we observe the following:

$$M_X^{(k)}(0)=\mathbb{E}(x^k)=\mu'_k$$

This completes the proof.
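As a quick illustration of this theorem (an optional check, not part of the proof), we can let SymPy differentiate the moment-generating function of our earlier discrete example, $M_X(t)=0.2e^{3t}+0.8e^{4t}$, and confirm that the $k$-th derivative at $t=0$ matches the raw moment $\sum_x x^k\,p_X(x)$. The values of $k$ below are arbitrary:

```python
import sympy as sp

t = sp.symbols('t')
# MGF of the discrete example: 0.2*e^{3t} + 0.8*e^{4t}
M = sp.Rational(1, 5) * sp.exp(3 * t) + sp.Rational(4, 5) * sp.exp(4 * t)

for k in [1, 2, 3]:
    mgf_moment = sp.diff(M, t, k).subs(t, 0)                              # M_X^{(k)}(0)
    direct_moment = sp.Rational(1, 5) * 3**k + sp.Rational(4, 5) * 4**k   # sum of x^k * p(x)
    print(k, sp.simplify(mgf_moment), direct_moment)
```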

Example.

Deriving mean and variance of standard normal random variables

Let $X$ be a random variable drawn from a standard normal distribution. Using the derivatives of the moment-generating function, compute the mean $\mathbb{E}(X)$ and variance $\mathbb{V}(X)$.

Solution. In the previous example, we derived the moment-generating function of a standard normal random variable $X$ to be:

$$M_X(t)= \exp\Big(\frac{t^2}{2}\Big)$$

Let's take the first derivative of the moment generating function:

$$\begin{align*} M_X^{(1)}(t)&= \frac{d}{dt}\exp\Big(\frac{t^2}{2}\Big)\\ &=t\cdot\exp\Big(\frac{t^2}{2}\Big) \end{align*}$$

From the theorem above, we know that setting $t=0$ will give us the first moment of $X$ about the origin:

$$\begin{align*} \mathbb{E}(X)&=M_X^{(1)}(0)\\ &=0 \end{align*}$$

Let's now take the second derivative:

$$\begin{align*} M_X^{(2)}(t)&= \frac{d}{dt}(M_X^{(1)}(t))\\ &=\frac{d}{dt}\Big(t\cdot\exp\Big(\frac{t^2}{2}\Big)\Big)\\ &=\exp\Big(\frac{t^2}{2}\Big)+t^2\exp\Big(\frac{t^2}{2}\Big)\\ \end{align*}$$

Once again, from the theorem above, we know that setting $t=0$ in $M_X^{(2)}(t)$ will give us the second moment of $X$ about the origin:

$$\begin{align*} \mathbb{E}(X^2)&=M_X^{(2)}(0)\\ &= \exp\Big(\frac{0^2}{2}\Big)+(0)^2\exp\Big(\frac{0^2}{2}\Big)\\ &=1 \\ \end{align*}$$

Now, the variance can be computed from the first two moments:

$$\begin{align*} \mathbb{V}(X)&=\mathbb{E}(X^2)-(\mathbb{E}(X))^2\\ &=1-(0)^2\\ &=1 \end{align*}$$

Therefore, as we would expect, the mean and variance of a standard normal variable $X$ are:

$$\begin{align*} \mathbb{E}(X)=0\\ \mathbb{V}(X)=1\\ \end{align*}$$
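Here is the same computation done symbolically with SymPy, as an optional sanity check - differentiating $\exp(t^2/2)$ and evaluating at $t=0$ recovers the same mean and variance:

```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)                           # MGF of the standard normal

first_moment = sp.diff(M, t, 1).subs(t, 0)     # E(X)   -> 0
second_moment = sp.diff(M, t, 2).subs(t, 0)    # E(X^2) -> 1
variance = second_moment - first_moment**2     # V(X)   -> 1
print(first_moment, second_moment, variance)
```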

Properties of moment-generating functions

Theorem.

Moment-generating function of random variable X+a

Let $X$ be any random variable and $a$ be some scalar. The moment-generating function of $X+a$ can be expressed as:

$$M_{X+a}(t)=e^{at}M_X(t)$$

Proof. We can use the definition of moment generating functions to prove this:

$$\begin{align*} M_{X+a}(t)&=\mathbb{E}(e^{t(x+a)})\\ &=\mathbb{E}(e^{tx+ta})\\ &=\mathbb{E}(e^{tx}e^{ta})\\ &=e^{ta}\mathbb{E}(e^{tx})\\ &=e^{ta}M_X(t) \end{align*}$$

This completes the proof.
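As an optional numerical check of this identity, we can reuse the discrete distribution from our first example ($P(X=3)=0.2$, $P(X=4)=0.8$) with an arbitrary shift $a=2$ and compare both sides directly:

```python
import numpy as np

a = 2.0                          # arbitrary shift for illustration
xs = np.array([3.0, 4.0])        # support of X
ps = np.array([0.2, 0.8])        # probabilities

for t in [0.1, 0.5, 1.0]:
    lhs = np.sum(np.exp(t * (xs + a)) * ps)             # M_{X+a}(t) computed directly
    rhs = np.exp(a * t) * np.sum(np.exp(t * xs) * ps)    # e^{at} * M_X(t)
    print(f"t={t}: {lhs:.6f} vs {rhs:.6f}")
```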

Theorem.

Moment-generating function of random variable aX

Let $X$ be any random variable and $a$ be some scalar. The moment-generating function of $aX$ can be expressed as:

$$M_{aX}(t)=M_X(at)$$

Proof. We can use the definition of moment generating functions to prove this:

$$\begin{align*} M_{aX}(t)&=\mathbb{E}(e^{t(ax)})\\ &=\mathbb{E}(e^{x(at)})\\ &=M_X(at) \end{align*}$$

This completes the proof.
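Again, a quick numerical check (for illustration only) using the same discrete distribution and an arbitrary scalar $a=3$:

```python
import numpy as np

a = 3.0                          # arbitrary scalar for illustration
xs = np.array([3.0, 4.0])        # support of X
ps = np.array([0.2, 0.8])        # probabilities

M = lambda t: np.sum(np.exp(t * xs) * ps)    # M_X(t)

for t in [0.1, 0.5, 1.0]:
    lhs = np.sum(np.exp(t * (a * xs)) * ps)  # M_{aX}(t) computed directly
    rhs = M(a * t)                           # M_X(at)
    print(f"t={t}: {lhs:.6f} vs {rhs:.6f}")
```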

Theorem.

Moment-generating function of a sum of random variables

If $X_1$, $X_2$, $\cdots$, $X_n$ are independent random variables with moment-generating functions $M_{X_1}(t)$, $M_{X_2}(t)$, $\cdots$, $M_{X_n}(t)$ respectively, and $Y=X_1+X_2+\cdots+X_n$, then we have that:

$$\begin{align*} M_Y(t)=M_{X_1}(t)M_{X_2}(t)\cdots{M_{X_n}(t)} \end{align*}$$

Proof. Once again, we use the definition of moment-generating functions:

$$\begin{align*} M_Y(t)&=M_{X_1+X_2+\cdots+X_n}(t)\\ &=\mathbb{E}(e^{t(x_1+x_2+\cdots+x_n)})\\ &=\mathbb{E}(e^{tx_1+tx_2+\cdots+tx_n})\\ &=\mathbb{E}(e^{tx_1}e^{tx_2}\cdots{e^{tx_n}}) \end{align*}$$

Since $X_1$, $X_2$, $\cdots$, $X_n$ are independent:

$$\begin{align*} M_Y(t)&= \mathbb{E}(e^{tx_1}) \mathbb{E}(e^{tx_2}) \cdots \mathbb{E}(e^{tx_n})\\ &=M_{X_1}(t)M_{X_2}(t)\cdots{M_{X_n}(t)} \end{align*}$$

This completes the proof.
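To see this property in action, here is a rough Monte Carlo check (an illustrative sketch, not part of the proof) using two independent random variables - one drawn from our earlier discrete distribution and one from the standard normal; the sample size and values of $t$ are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 500_000
x1 = rng.choice([3, 4], size=n, p=[0.2, 0.8])   # X_1: discrete example
x2 = rng.standard_normal(n)                     # X_2: standard normal, independent of X_1

for t in [0.1, 0.3]:
    lhs = np.exp(t * (x1 + x2)).mean()                     # estimate of M_{X_1+X_2}(t)
    rhs = np.exp(t * x1).mean() * np.exp(t * x2).mean()    # estimate of M_{X_1}(t) * M_{X_2}(t)
    print(f"t={t}: {lhs:.4f} vs {rhs:.4f}")
```

The two estimates should agree up to sampling error precisely because $X_1$ and $X_2$ are independent; for dependent variables the factorization generally fails.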

Theorem.

Uniqueness Theorem

Let $X$ and $Y$ be two random variables with moment-generating functions $M_X(t)$ and $M_Y(t)$ respectively. If $M_X(t)=M_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution. In other words, if the moment-generating functions of two random variables are the same, then they must have the same probability distribution.

The proof of the uniqueness theorem is quite complex and requires graduate-level mathematics, so we will omit it here 😞. The uniqueness theorem is one of the most important properties of the moment-generating function and is used to prove theorems such as the reproductive property of the normal distribution (TODO)!

Published by Isshin Inada