Moment Generating Function

Last updated: Jul 1, 2022

What is a moment-generating function?

We can describe a random variable $X$ with numerical measures such as the mean and standard deviation. However, these numerical measures do not uniquely capture the distribution of $X$ since many different distributions can still possess the same mean and standard deviation. One way of uniquely characterizing the distribution is by using moment-generating functions.

Theorem.

Uniqueness Theorem

Let $X$ and $Y$ be two random variables with moment-generating functions $M_X(t)$ and $M_Y(t)$ respectively. If $M_X(t)=M_Y(t)$ for all values of $t$ in an open interval containing zero, then $X$ and $Y$ have the same probability distribution.
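For example, suppose we find that some random variable $X$ has the moment-generating function

$$M_X(t)=e^{\lambda(e^t-1)}$$

This is precisely the moment-generating function of a Poisson distribution with parameter $\lambda$, so by the uniqueness theorem, $X$ must follow a $\text{Poisson}(\lambda)$ distribution.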

Definition.

Moment-generating function

The moment-generating function $M_X(t)$ for a random variable $X$ is defined to be:

$$M_X(t)=\mathbb{E}(e^{tX})$$
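To make the definition concrete, here is a small sketch (not part of the original article) that approximates $M_X(t)=\mathbb{E}(e^{tX})$ by averaging $e^{tX}$ over simulated draws of a Bernoulli($p$) variable, whose MGF is known in closed form to be $1-p+pe^t$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, t = 0.3, 0.5

# Monte Carlo estimate of M_X(t) = E[e^{tX}] for X ~ Bernoulli(p)
samples = rng.binomial(1, p, size=200_000)
mgf_estimate = np.exp(t * samples).mean()

# Closed form for the Bernoulli MGF: M_X(t) = (1 - p) + p * e^t
mgf_exact = (1 - p) + p * np.exp(t)

print(mgf_estimate, mgf_exact)  # the two values should agree closely
```

With enough samples, the empirical average converges to the exact MGF value, which is exactly what the expectation in the definition promises.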
Theorem.

Derivative of moment-generating function

$$M_X^{(k)}(0)=\mathbb{E}(X^k)=\mu'_k$$

From the Taylor series of the exponential function, we know that $e^{tX}$ can be represented in the following way:

$$e^{tX}=1+tX +\frac{(tX)^2}{2!} +\frac{(tX)^3}{3!} +\cdots +\frac{(tX)^n}{n!}+\cdots$$

Now, we take the expected value of both sides, applying linearity of expectation to each term:

$$\begin{equation}\label{eq:E8GLN2o41NCrdpqjVLd} \begin{aligned}[b] M_X(t)&= \mathbb{E}(e^{tX})\\&= \mathbb{E}\Big(\frac{(tX)^0}{0!}+\frac{(tX)^1}{1!}+\frac{(tX)^2}{2!} +\frac{(tX)^3}{3!} +\cdots +\frac{(tX)^n}{n!}+\cdots\Big)\\ &= \mathbb{E}\Big(\frac{(tX)^0}{0!}\Big) +\mathbb{E}\Big(\frac{(tX)^1}{1!}\Big) +\mathbb{E}\Big(\frac{(tX)^2}{2!}\Big) +\mathbb{E}\Big(\frac{(tX)^3}{3!}\Big) +\cdots +\mathbb{E}\Big(\frac{(tX)^n}{n!}\Big)+\cdots\\ &=\frac{t^0}{0!}\mathbb{E}(X^0) +\frac{t^1}{1!}\mathbb{E}(X^1) +\frac{t^2}{2!}\mathbb{E}(X^2) +\frac{t^3}{3!}\mathbb{E}(X^3) +\cdots +\frac{t^n}{n!}\mathbb{E}(X^n)+\cdots\\ &=\frac{t^0}{0!}\sum_x x^0p(x) +\frac{t^1}{1!}\sum_x x^1p(x) +\frac{t^2}{2!}\sum_x x^2p(x) +\frac{t^3}{3!}\sum_x x^3p(x) +\cdots +\frac{t^n}{n!}\sum_x x^np(x)+\cdots \end{aligned} \end{equation}$$

Now we define the $i$-th moment about the origin, $\mu'_i$, as follows (written here for a discrete $X$ with probability mass function $p(x)$):

$$\mu'_i= \mathbb{E}(X^i) =\sum_x{x^ip(x)}$$

We can now write \eqref{eq:E8GLN2o41NCrdpqjVLd} as follows:

$$\begin{align*} M_X(t)&= \frac{t^0}{0!}\mu'_0 +\frac{t^1}{1!}\mu'_1 +\frac{t^2}{2!}\mu'_2 +\frac{t^3}{3!}\mu'_3 +\cdots +\frac{t^n}{n!}\mu'_n+\cdots\\ &= \sum_{i=0}^\infty\frac{t^i}{i!}\mu'_i \end{align*}$$

The magic happens when we start taking the derivatives of $M_X(t)$. The first derivative of $M_X(t)$ can be calculated like so:

$$\begin{align*} M_X^{(1)}(t)&= \frac{d}{dt}\mathbb{E}(e^{tX})\\ &=\frac{d}{dt}\Big( \frac{t^0}{0!}\mathbb{E}(X^0) +\frac{t^1}{1!}\mathbb{E}(X^1) +\frac{t^2}{2!}\mathbb{E}(X^2) +\frac{t^3}{3!}\mathbb{E}(X^3) +\cdots +\frac{t^n}{n!}\mathbb{E}(X^n)+\cdots\Big)\\ &= \frac{t^0}{0!}\mathbb{E}(X^1) +\frac{t^1}{1!}\mathbb{E}(X^2) +\frac{t^2}{2!}\mathbb{E}(X^3) +\cdots +\frac{t^{n-1}}{(n-1)!}\mathbb{E}(X^n)+\cdots \end{align*}$$

If we set $t=0$, every term containing $t$ vanishes and we have:

$$M^{(1)}_X(0)=\mathbb{E}(X^1)=\mu'_1$$

We get the first moment of a random variable about the origin!
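As a quick numerical check (an added sketch, assuming the MGF is available in closed form), we can differentiate the Poisson MGF $M(t)=e^{\lambda(e^t-1)}$ at $t=0$ with a central difference and recover the mean $\lambda$:

```python
import math

lam = 2.5  # Poisson rate; the mean of Poisson(lam) is also lam

def mgf_poisson(t: float) -> float:
    """MGF of a Poisson(lam) random variable: exp(lam * (e^t - 1))."""
    return math.exp(lam * (math.exp(t) - 1.0))

# Central-difference approximation of the first derivative at t = 0
h = 1e-6
first_moment = (mgf_poisson(h) - mgf_poisson(-h)) / (2 * h)

print(first_moment)  # approximately 2.5, the mean of Poisson(2.5)
```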

Now, let's take the second derivative:

$$\begin{align*} M_X^{(2)}(t)&= \frac{d}{dt}M_X^{(1)}(t)\\ &=\frac{d}{dt}\Big(\frac{t^0}{0!}\mathbb{E}(X^1) +\frac{t^1}{1!}\mathbb{E}(X^2) +\frac{t^2}{2!}\mathbb{E}(X^3) +\cdots +\frac{t^{n-1}}{(n-1)!}\mathbb{E}(X^n)+\cdots\Big)\\ &=\frac{t^0}{0!}\mathbb{E}(X^2) +\frac{t^1}{1!}\mathbb{E}(X^3) +\cdots +\frac{t^{n-2}}{(n-2)!}\mathbb{E}(X^n)+\cdots \end{align*}$$

If we set $t=0$:

$$M_X^{(2)}(0)=\mathbb{E}(X^2)=\mu'_2$$

We now get the second moment of a random variable about the origin!

Here, we can easily see where the moment-generating function gets its name: its derivatives generate the $k$-th moment of a random variable about the origin. For the general case, we observe the following:

$$M_X^{(k)}(0)=\mathbb{E}(X^k)=\mu'_k$$
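This general pattern is easy to verify symbolically. The sketch below (an added example, assuming the SymPy library is available) differentiates the MGF of an Exponential($\lambda$) variable, $M(t)=\lambda/(\lambda-t)$, $k$ times at $t=0$ and checks the result against the known raw moments $\mathbb{E}(X^k)=k!/\lambda^k$:

```python
import sympy as sp

t = sp.symbols("t")
lam = sp.Rational(3)  # rate of an Exponential(lam) distribution

# MGF of Exponential(lam): M(t) = lam / (lam - t), valid for t < lam
M = lam / (lam - t)

# The k-th derivative at t = 0 should equal the k-th raw moment E[X^k] = k! / lam^k
for k in range(1, 5):
    moment = sp.diff(M, t, k).subs(t, 0)
    assert moment == sp.factorial(k) / lam**k
```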
Theorem.

Moment-generating function of random variable X+a

$$M_{X+a}(t)=e^{at}M_X(t)$$

We can use the definition of moment-generating functions to prove this:

$$\begin{align*} M_{X+a}(t)&=\mathbb{E}(e^{t(X+a)})\\ &=\mathbb{E}(e^{tX+ta})\\ &=\mathbb{E}(e^{tX}e^{ta})\\ &=e^{ta}\mathbb{E}(e^{tX})\\ &=e^{ta}M_X(t) \end{align*}$$
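As a sanity check (an added sketch using the standard normal, whose MGF $e^{t^2/2}$ is well known), we can confirm that shifting a standard normal $X$ by $a$ multiplies its MGF by $e^{at}$, since $X+a \sim \text{Normal}(a, 1)$ has MGF $e^{at+t^2/2}$:

```python
import math

a, t = 1.7, 0.4

# MGF of a standard normal: M_X(t) = exp(t^2 / 2)
def mgf_std_normal(t: float) -> float:
    return math.exp(t**2 / 2)

# MGF of X + a, for X standard normal, is that of Normal(a, 1): exp(a*t + t^2 / 2)
mgf_shifted = math.exp(a * t + t**2 / 2)

# Theorem: M_{X+a}(t) = e^{a t} * M_X(t)
assert math.isclose(mgf_shifted, math.exp(a * t) * mgf_std_normal(t))
```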
Theorem.

Moment-generating function of random variable aX

$$M_{aX}(t)=M_X(at)$$

We can use the definition of moment-generating functions to prove this:

$$\begin{align*} M_{aX}(t)&=\mathbb{E}(e^{t(aX)})\\ &=\mathbb{E}(e^{(at)X})\\ &=M_X(at) \end{align*}$$
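We can also check this numerically (an added sketch, not from the original): estimate $M_{aX}(t)=\mathbb{E}(e^{t(aX)})$ by simulation for a standard normal $X$ and compare it with the closed form $M_X(at)=e^{(at)^2/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)
a, t = 2.0, 0.3

z = rng.standard_normal(500_000)  # draws of X ~ Normal(0, 1)

# Left side: Monte Carlo estimate of M_{aX}(t) = E[e^{t(aX)}]
lhs = np.exp(t * a * z).mean()

# Right side: the theorem says this equals M_X(at) = exp((at)^2 / 2)
rhs = np.exp((a * t) ** 2 / 2)

print(lhs, rhs)  # the two values should agree closely
```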
Theorem.

Moment-generating function of a sum of random variables

Let $X_1, X_2, \ldots, X_n$ be independent random variables with moment-generating functions $M_{X_1}(t)$, $M_{X_2}(t)$, ..., $M_{X_n}(t)$ respectively, and let $Y=X_1+X_2+\cdots+X_n$. Then we have that:

$$\begin{align*} M_Y(t)=M_{X_1}(t)M_{X_2}(t)\cdots{M_{X_n}(t)} \end{align*}$$

Once again, we use the definition of moment-generating functions:

$$\begin{align*} M_Y(t)&=M_{X_1+X_2+\cdots+X_n}(t)\\ &=\mathbb{E}(e^{t(X_1+X_2+\cdots+X_n)})\\ &=\mathbb{E}(e^{tX_1+tX_2+\cdots+tX_n})\\ &=\mathbb{E}(e^{tX_1}e^{tX_2}\cdots{e^{tX_n}}) \end{align*}$$

Since $X_1, X_2, \ldots, X_n$ are independent, the expectation of the product factors into a product of expectations:

$$\begin{align*} M_Y(t)&= \mathbb{E}(e^{tX_1}) \mathbb{E}(e^{tX_2}) \cdots \mathbb{E}(e^{tX_n})\\ &=M_{X_1}(t)M_{X_2}(t)\cdots{M_{X_n}(t)} \end{align*}$$
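To see the theorem in action (an added sketch), we can estimate the MGF of a sum of $n$ independent Bernoulli($p$) variables by simulation and compare it with the product of the individual MGFs, $(1-p+pe^t)^n$ (this product is exactly the MGF of a $\text{Binomial}(n,p)$ distribution):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, t = 10, 0.4, 0.5

# n independent Bernoulli(p) columns; Y is their row-wise sum
x = rng.binomial(1, p, size=(200_000, n))
y = x.sum(axis=1)

# Monte Carlo estimate of M_Y(t) = E[e^{tY}]
mgf_sum = np.exp(t * y).mean()

# Product of the n individual Bernoulli MGFs: (1 - p + p e^t)^n
mgf_product = (1 - p + p * np.exp(t)) ** n

print(mgf_sum, mgf_product)  # the two values should agree closely
```

This also illustrates why MGFs are so useful for identifying the distribution of a sum: the product we computed is the Binomial($n,p$) MGF, so by the uniqueness theorem, $Y$ is Binomial($n,p$).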
Published by Isshin Inada