Bonferroni's inequality


Bonferroni's inequality gives a lower bound on the probability of a finite intersection of events.

It states that:

$$\mathbb{P}(A_1\,\cap\,A_2) \ge \mathbb{P}(A_1) + \mathbb{P}(A_2) -1$$

Generalizing to the case of $n$ events:

$$\mathbb{P} (A_1\,\cap\,A_2\,\cap\,\cdots\,\cap\,A_n) \ge \mathbb{P}(A_1) + \mathbb{P}(A_2) + \,\cdots\,+\mathbb{P}(A_n) - (n-1)$$
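
To make the general statement concrete, here is a minimal Python sketch that builds a few random events on a finite, equally likely sample space and checks that the probability of their intersection never falls below the Bonferroni bound. The sample-space size, the number of events, and the way the events are generated are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_outcomes = 1_000   # equally likely outcomes in the sample space
n_events = 4         # events A_1, ..., A_n

# Each event is a random subset of the sample space, stored as a boolean mask
# (roughly 90% of outcomes belong to each event).
events = rng.random((n_events, n_outcomes)) < 0.9

# Exact probabilities under the uniform distribution on the outcomes.
p_individual = events.mean(axis=1)           # P(A_i) for each i
p_intersection = events.all(axis=0).mean()   # P(A_1 ∩ ... ∩ A_n)

# Bonferroni's lower bound: sum of P(A_i) minus (n - 1).
bonferroni_bound = p_individual.sum() - (n_events - 1)

print(f"P(intersection)  = {p_intersection:.4f}")
print(f"Bonferroni bound = {bonferroni_bound:.4f}")
assert p_intersection >= bonferroni_bound - 1e-12
```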

Proof

From De Morgan's laws we know that:

$$(A_1 \, \cap\, A_2)^C = {A_1}^C\,\cup{A_2}^C$$

As the two events are equal, their probabilities will also be equal:

$$\mathbb{P}((A_1\,\cap\,A_2)^C)=\mathbb{P}({A_1}^C\,\cup{A_2}^C)$$

The union bound states that $\mathbb{P}(A_1\,\cup\,A_2) \le \mathbb{P}(A_1) + \mathbb{P}(A_2)$, hence applying it to the above we can rewrite as:

$$\begin{align*} \mathbb{P}((A_1\,\cap\,A_2)^C)&=\mathbb{P}({A_1}^C\,\cup{A_2}^C) \le \mathbb{P}({A_1}^C) + \mathbb{P}({A_2}^C) \\ \mathbb{P}((A_1\,\cap\,A_2)^C) &\le\mathbb{P}({A_1}^C) + \mathbb{P}({A_2}^C) \end{align*}$$

Another way to write $\mathbb{P}((A_1\,\cap\,A_2)^C)$ is simply $1 -\mathbb{P}(A_1\,\cap\,A_2)$. Rewriting each complement on the right-hand side in the same way, $\mathbb{P}({A_i}^C) = 1 - \mathbb{P}(A_i)$, gives us:

$$\begin{align*} 1-\mathbb{P}(A_1\,\cap\,A_2) &\le 1\,-\,\mathbb{P}(A_1) + 1-\mathbb{P}(A_2) \\ \mathbb{P}(A_1 \,\cap\,A_2) &\ge \mathbb{P}(A_1) + \mathbb{P}(A_2) -1 \end{align*}$$
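
The proof above covers two events. The general case follows by induction (a step sketched here for completeness): grouping the first $n-1$ events into a single event, applying the two-event inequality, and then applying the induction hypothesis to the first $n-1$ events gives:

$$\begin{align*} \mathbb{P}(A_1\,\cap\,\cdots\,\cap\,A_n) &= \mathbb{P}\big((A_1\,\cap\,\cdots\,\cap\,A_{n-1})\,\cap\,A_n\big) \\ &\ge \mathbb{P}(A_1\,\cap\,\cdots\,\cap\,A_{n-1}) + \mathbb{P}(A_n) - 1 \\ &\ge \big[\mathbb{P}(A_1)+\cdots+\mathbb{P}(A_{n-1}) - (n-2)\big] + \mathbb{P}(A_n) - 1 \\ &= \mathbb{P}(A_1) + \mathbb{P}(A_2) + \cdots + \mathbb{P}(A_n) - (n-1) \end{align*}$$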

Example

Given the following information on sports played by the 25 students in a class:

| Sport | Number of students | Probability |
| --- | --- | --- |
| Football only | 2 | 2/25 = 0.08 |
| Baseball only | 3 | 3/25 = 0.12 |
| Both Football and Baseball | 19 | 19/25 = 0.76 |
| Other | 1 | 1/25 = 0.04 |

We can say that:

Students playing Football = "Football only" + "Both Football and Baseball" = 2 + 19 = 21

Students playing Baseball = "Baseball only" + "Both Football and Baseball" = 3 + 19 = 22

Hence:

$$\mathbb{P}(Football) = \frac{21}{25} = 0.84$$
$$\mathbb{P}(Baseball) = \frac{22}{25} = 0.88$$

Referring back to Bonferroni's inequality:

$$\begin{align*} \mathbb{P}(\text{Football}\,\cap\,\text{Baseball})&\ge\mathbb{P}(\text{Football})+\mathbb{P}(\text{Baseball})-(2-1)\\ 0.76 &\ge 0.84 + 0.88 -1 \\ 0.76 &\ge 0.72 \end{align*}$$

We can see that the inequality does indeed hold here.
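
As a quick sanity check, the same numbers can be reproduced in a few lines of Python, using the counts from the table above:

```python
total = 25

football = 2 + 19   # "Football only" + "Both Football and Baseball"
baseball = 3 + 19   # "Baseball only" + "Both Football and Baseball"
both = 19

p_football = football / total                   # 0.84
p_baseball = baseball / total                   # 0.88
p_both = both / total                           # 0.76

bonferroni_bound = p_football + p_baseball - 1  # 0.72

print(f"P(Football ∩ Baseball) = {p_both:.2f}")
print(f"Bonferroni lower bound = {bonferroni_bound:.2f}")
assert p_both >= bonferroni_bound
```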
