- Find and interpret joint and marginal probabilities.
- Find and interpret conditional probabilities.
- Use the multiplication rule to find the probability of two events occurring together.
In the last section, we examined the following contingency table.
| Result | Inoculated: yes | Inoculated: no | Total  |
|--------|-----------------|----------------|--------|
| lived  | 0.0382          | 0.8252         | 0.8634 |
| died   | 0.0010          | 0.1356         | 0.1366 |
| Total  | 0.0392          | 0.9608         | 1.0000 |
What can we learn about the result of smallpox if we already know something about inoculation status?
Conditional probability: the probability of some event \(A\) if we know that event \(B\) occurred (or is true): \[P(A|B) = \frac{P(A\text{ and }B)}{P(B)}\] where the symbol | is read as “given”.
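Applying this definition to the table above, a minimal sketch in Python (the joint and marginal probabilities are taken directly from the contingency table):

```python
# Conditional probability from the smallpox contingency table:
# P(lived | inoculated) = P(lived and inoculated) / P(inoculated)
p_lived_and_inoculated = 0.0382  # joint probability (lived, yes)
p_inoculated = 0.0392            # marginal probability (column total)

p_lived_given_inoculated = p_lived_and_inoculated / p_inoculated
print(round(p_lived_given_inoculated, 4))  # 0.9745
```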
If knowing whether event \(B\) occurs tells us nothing about event \(A\), the events are independent. For example, if we know that the first flip of a (fair) coin came up heads, that doesn’t tell us anything about what will happen next time we flip that coin.
If \(A\) and \(B\) are independent events, then \[P(A \text{ and }B) = P(A)P(B).\]
Find the probability of rolling a \(6\) on your first roll of a die and a \(6\) on your second roll.
Let \(A=\) (rolling a \(6\) on first roll) and \(B=\) (rolling a \(6\) on second roll). For each roll, the probability of getting a \(6\) is \(1/6\), so \(P(A) = \frac{1}{6}\) and \(P(B) = \frac{1}{6}\).
Then, because each roll is independent of any other rolls, \[P(A \text{ and }B) = P(A)P(B) = \frac{1}{6}\times\frac{1}{6} = \frac{1}{36}\]
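The arithmetic above can be checked with exact fractions:

```python
from fractions import Fraction

p_a = Fraction(1, 6)  # P(A): 6 on the first roll
p_b = Fraction(1, 6)  # P(B): 6 on the second roll

# Independent events: P(A and B) = P(A) * P(B)
p_both = p_a * p_b
print(p_both)  # 1/36
```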
If \(A\) and \(B\) are any two events, then \[P(A \text{ and }B) = P(A|B)P(B).\]
Suppose we know that 45.5% of US households have dogs and that among those with dogs, 12.1% have cats. Find the probability that a US household has both dogs and cats.
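One way to set up this computation, a sketch applying the General Multiplication Rule to the stated rates:

```python
p_dogs = 0.455             # P(dogs): 45.5% of US households have dogs
p_cats_given_dogs = 0.121  # P(cats | dogs): 12.1% of dog households have cats

# General Multiplication Rule: P(cats and dogs) = P(cats | dogs) * P(dogs)
p_both = p_cats_given_dogs * p_dogs
print(round(p_both, 4))  # 0.0551
```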
We can put our idea of independence together with our multiplication rules to come up with three ways to test for independence. Events \(A\) and \(B\) are independent if any one of the following holds:

- \(P(A \text{ and } B) = P(A)P(B)\)
- \(P(A|B) = P(A)\)
- \(P(B|A) = P(B)\)
Suppose we know that 45.5% of US households have dogs and that among those with dogs, 12.1% have cats. Additionally, 32.1% of US households have cats. For US households, are having cats and having dogs independent events?
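A quick numeric check using the second test above, comparing \(P(\text{cats}|\text{dogs})\) to \(P(\text{cats})\) from the stated rates:

```python
p_cats = 0.321             # P(cats): 32.1% of US households have cats
p_cats_given_dogs = 0.121  # P(cats | dogs)

# If having cats and having dogs were independent, knowing a household
# has dogs would not change the probability of cats: P(cats|dogs) == P(cats)
independent = abs(p_cats_given_dogs - p_cats) < 1e-9
print(independent)  # False
```

Since \(0.121 \ne 0.321\), knowing a household has dogs changes the probability of cats, so the events are not independent.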
We observed in the last section that we can get a marginal probability by adding up all of its associated joint probabilities in a contingency table.
That is, \[ P(\text{lived}) = P(\text{lived and inoculated}) + P(\text{lived and not inoculated}) \]
We can generalize this for some event \(A\) and some event \(B\) whose \(k\) possible outcomes can be listed as (mutually exclusive) events \(B_1, B_2, \dots B_k\).
Then \[P(A) = P(A \text{ and }B_1) + P(A \text{ and }B_2) + \dots + P(A \text{ and }B_k) \] Using our General Multiplication Rule, we have \(P(A \text{ and }B) = P(A|B)P(B)\), so we can rewrite this as \[P(A) = P(A|B_1)P(B_1) + P(A|B_2)P(B_2) + \dots + P(A|B_k)P(B_k) \]
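Using the smallpox table values, this identity can be verified numerically (the conditional probabilities are computed from the joint and marginal entries in the table):

```python
# Law of total probability for the smallpox data:
# P(lived) = P(lived|inoc)P(inoc) + P(lived|not inoc)P(not inoc)
p_inoc, p_not_inoc = 0.0392, 0.9608           # marginal probabilities
p_lived_given_inoc = 0.0382 / 0.0392          # joint / marginal
p_lived_given_not_inoc = 0.8252 / 0.9608      # joint / marginal

p_lived = p_lived_given_inoc * p_inoc + p_lived_given_not_inoc * p_not_inoc
print(round(p_lived, 4))  # 0.8634, matching the marginal from the table
```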
Sometimes, we get some additional information about a probability and we want to update our understanding based on this new information. This is the basic idea behind Bayes’ Theorem.
If \(A\) and \(B\) are any two events such that \(P(B)\ne0\), then \[P(A|B) = \frac{P(B|A)P(A)}{P(B)}\]
Consider a test for some rare disease. If you have the disease, this test accurately identifies it 99% of the time. Suppose 0.5% of the population has this disease and the test results in a positive result 8% of the time. Given a positive test result, what’s the probability a person actually has the disease?
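A sketch plugging the stated rates into Bayes’ Theorem, with \(A =\) (has disease) and \(B =\) (positive test):

```python
# Bayes' Theorem: P(disease | positive) = P(positive | disease) P(disease) / P(positive)
p_pos_given_disease = 0.99  # test correctly identifies the disease 99% of the time
p_disease = 0.005           # 0.5% of the population has the disease
p_pos = 0.08                # the test is positive 8% of the time

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # 0.0619
```

Even with an accurate test, the disease is so rare that a positive result corresponds to only about a 6.2% chance of actually having it.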