Examples of conditional probability in the following topics:
- We call this a conditional probability because we computed the probability under a condition: parents = used.
- There are two parts to a conditional probability: the outcome of interest and the condition.
- Thus, the conditional probability could be computed:
- The ratio of these probabilities gives our general formula for conditional probability.
- The conditional probability of the outcome of interest $A$ given condition $B$ is computed as follows: $P(A|B) = \frac{P(A \text{ and } B)}{P(B)}$.
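The ratio in the formula above can be sketched numerically. The records below are hypothetical stand-ins for the parents/teen substance-use data mentioned earlier; only the ratio-of-counts computation is the point:

```python
# Sketch: conditional probability as a ratio of counts (hypothetical data).
# Each record is (parents, teen): whether the parents used a substance and
# whether the teen does, mirroring the "parents = used" condition.
records = [
    ("used", "uses"), ("used", "not"), ("used", "uses"),
    ("not", "not"), ("not", "uses"), ("not", "not"),
]

n_condition = sum(1 for p, _ in records if p == "used")             # count of B
n_both = sum(1 for p, t in records if p == "used" and t == "uses")  # A and B

# P(A | B) = P(A and B) / P(B), computed here from sample counts
# (the shared denominator cancels).
p_conditional = n_both / n_condition
print(p_conditional)  # 2 of the 3 "used" rows
```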
- The conditional probability of an event is the probability that an event will occur given that another event has occurred.
- The notation $P(B|A)$ indicates a conditional probability, meaning it indicates the probability of one event under the condition that we know another event has happened.
- The conditional probability $\displaystyle P(B|A)$ of an event $B$, given an event $A$ with $P(A) > 0$, is defined by: $\displaystyle P(B|A) = \frac{P(A \cap B)}{P(A)}$.
- The conditional probability $P(B|A)$ is not always equal to the unconditional probability $P(B)$.
- Mathematically, Bayes' theorem gives the relationship between the probabilities of $A$ and $B$, $P(A)$ and $P(B)$, and the conditional probabilities of $A$ given $B$ and $B$ given $A$.
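Bayes' theorem can be sketched with assumed numbers (none of these probabilities come from the text; they are chosen only to exercise the formula):

```python
# Sketch of Bayes' theorem with hypothetical probabilities:
#   P(A | B) = P(B | A) * P(A) / P(B)
p_a = 0.3              # prior P(A)            (assumed)
p_b_given_a = 0.8      # likelihood P(B | A)   (assumed)
p_b_given_not_a = 0.2  # P(B | not A)          (assumed)

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Inverted conditional probability via Bayes' theorem.
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))
```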
- We can show this is mathematically true using conditional probabilities.
- (a) What is the probability that the first die, X, is 1?
- (b) What is the probability that both X and Y are 1?
- (c) Use the formula for conditional probability to compute P(Y = 1 | X = 1).
- (d) What is P(Y = 1)? Since the two dice are independent, it is the same as in part (c): P(Y = 1) = 1/6.
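Parts (a) through (d) can be checked by enumerating the 36 equally likely outcomes of the two dice:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes (X, Y) of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
n = len(outcomes)

p_x1 = Fraction(sum(1 for x, y in outcomes if x == 1), n)               # (a)
p_both = Fraction(sum(1 for x, y in outcomes if x == 1 and y == 1), n)  # (b)
p_y1_given_x1 = p_both / p_x1   # (c) P(Y=1 | X=1) by the formula
p_y1 = Fraction(sum(1 for x, y in outcomes if y == 1), n)               # (d)

print(p_x1, p_both, p_y1_given_x1, p_y1)  # 1/6 1/36 1/6 1/6
```

The conditional and unconditional probabilities in (c) and (d) agree because the dice are independent.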
- The multiplication rule states that the probability that $A$ and $B$ both occur is equal to the probability that $B$ occurs times the conditional probability that $A$ occurs given that $B$ occurs.
- In probability theory, the multiplication rule states that the probability that $A$ and $B$ occur is equal to the probability that $A$ occurs times the conditional probability that $B$ occurs, given that $A$ has already occurred.
- We obtain the general multiplication rule by multiplying both sides of the definition of conditional probability by the denominator.
- The probability that we get a $2$ on the die and a tails on the coin is $\frac{1}{6}\cdot \frac{1}{2} = \frac{1}{12}$, since the two events are independent.
- Apply the multiplication rule to calculate the probability of both $A$ and $B$ occurring
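The die-and-coin computation can be verified by listing the full sample space:

```python
from fractions import Fraction
from itertools import product

# Verify P(die = 2 and coin = tails) = 1/6 * 1/2 by listing the sample space.
sample_space = list(product(range(1, 7), ["heads", "tails"]))  # 12 outcomes

favorable = [(d, c) for d, c in sample_space if d == 2 and c == "tails"]
p_joint = Fraction(len(favorable), len(sample_space))

# The multiplication rule for independent events gives the same answer.
assert p_joint == Fraction(1, 6) * Fraction(1, 2)
print(p_joint)  # 1/12
```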
- In many instances, we are given a conditional probability of the form:
- But we would really like to know the inverted conditional probability:
- Tree diagrams can be used to find the second conditional probability when given the first.
- In the example, each of the probabilities on the right side was broken down into a product of a conditional probability and marginal probability using the tree diagram.
- Consider the following conditional probability for variable 1 and variable 2:
- Tree diagrams are annotated with marginal and conditional probabilities, as shown in Figure 2.17.
- The secondary branches are conditioned on the first, so we assign conditional probabilities to these branches.
- For example, the top branch in Figure 2.17 is the probability that result = lived conditioned on the information that inoculated = yes.
- To calculate this conditional probability, we need the following probabilities:
- The final grades, which correspond to the conditional probabilities provided, will be shown on the secondary branches.
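The branch arithmetic on a tree diagram can be sketched with hypothetical numbers; the actual values belong to Figure 2.17, which is not reproduced here:

```python
# Sketch of tree-diagram bookkeeping with hypothetical probabilities.
p_inoculated = 0.04        # marginal on the primary branch     (assumed)
p_lived_given_inoc = 0.97  # conditional on a secondary branch  (assumed)
p_lived_given_not = 0.85   # conditional on a secondary branch  (assumed)

# Each leaf of the tree carries a joint probability:
#   P(branch1 and branch2) = P(branch1) * P(branch2 | branch1)
joints = {
    ("inoculated", "lived"): p_inoculated * p_lived_given_inoc,
    ("inoculated", "died"): p_inoculated * (1 - p_lived_given_inoc),
    ("not inoculated", "lived"): (1 - p_inoculated) * p_lived_given_not,
    ("not inoculated", "died"): (1 - p_inoculated) * (1 - p_lived_given_not),
}

# The four leaves partition the sample space, so the joints sum to 1.
total = sum(joints.values())
print(round(total, 10))  # 1.0
```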
- The probability distribution of a discrete random variable $x$ lists the values and their probabilities, where value $x_1$ has probability $p_1$, value $x_2$ has probability $p_2$, and so on.
- Every probability $p_i$ is a number between 0 and 1, and the sum of all the probabilities is equal to 1.
- These are the conditional probabilities of the event $B$ given that $A_1$ is the case or that $A_2$ is the case, respectively.
- The formula, table, and probability histogram satisfy the following necessary conditions of discrete probability distributions:
- The probability mass function serves the same purpose as the probability histogram: it displays the specific probability for each value of the discrete random variable.
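The necessary conditions for a discrete probability distribution can be checked mechanically; the four-valued distribution below (a fair four-sided die) is purely illustrative:

```python
# Sketch: validate a discrete distribution given as value -> probability.
pmf = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}

# Necessary conditions: every p_i lies in [0, 1] ...
assert all(0 <= p <= 1 for p in pmf.values())
# ... and the probabilities sum to 1.
assert abs(sum(pmf.values()) - 1) < 1e-9

# The probability mass function then answers pointwise queries, e.g. P(x = 3):
print(pmf[3])  # 0.25
```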
- These totals represent marginal probabilities for the sample, which are the probabilities based on a single variable without conditioning on any other variables.
- For instance, a probability based solely on the student variable is a marginal probability:
- If a probability is based on a single variable, it is a marginal probability.
- Verify that Table 2.14 represents a probability distribution: events are disjoint, all probabilities are non-negative, and the probabilities sum to 1.
- We can compute marginal probabilities using joint probabilities in simple cases.
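A minimal sketch of computing marginal probabilities from joint probabilities, using a hypothetical 2×2 table (the variable names echo the student example but the numbers are invented):

```python
# Hypothetical joint probabilities for two variables.
joint = {
    ("student", "yes"): 0.10,
    ("student", "no"): 0.20,
    ("not student", "yes"): 0.30,
    ("not student", "no"): 0.40,
}

# A marginal probability is based on a single variable: sum the joint
# probabilities over every value of the other variable.
p_student = sum(p for (a, _), p in joint.items() if a == "student")
p_yes = sum(p for (_, b), p in joint.items() if b == "yes")

print(round(p_student, 2), round(p_yes, 2))  # 0.3 0.4
```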
- Experimental probability contrasts with theoretical probability, which is what we would expect to happen.
- In statistical terms, the empirical probability is an estimate of a probability.
- For example, consider estimating the probability that a man from a given population satisfies two conditions:
- A direct estimate could be found by counting the number of men who satisfy both conditions to give the empirical probability of the combined condition.
- An alternative estimate could be found by multiplying the proportion of men who are over six feet in height with the proportion of men who prefer strawberry jam to raspberry jam, but this estimate relies on the assumption that the two conditions are statistically independent.
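The two estimates can be compared on hypothetical survey records; each tuple below stands in for one man, recording (over six feet, prefers strawberry jam):

```python
# Hypothetical records: (height_over_6ft, prefers_strawberry_jam).
men = [
    (True, True), (True, False), (False, True), (False, False),
    (True, True), (False, False), (False, True), (False, False),
]
n = len(men)

# Direct estimate: count the men who satisfy both conditions.
p_direct = sum(1 for tall, straw in men if tall and straw) / n

# Product estimate: valid only if the two conditions are independent.
p_tall = sum(1 for tall, _ in men if tall) / n
p_straw = sum(1 for _, straw in men if straw) / n
p_product = p_tall * p_straw

# Any gap between the two estimates reflects dependence in this sample.
print(p_direct, p_product)
```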
- The hypergeometric distribution is a discrete probability distribution that describes the probability of $k$ successes in $n$ draws without replacement from a finite population of size $N$ containing exactly $K$ successes.
- The following conditions characterize the hypergeometric distribution:
- In the softball example, the probability of picking a woman first is $\frac{13}{24}$.
- The probability of picking a man second is $\frac{11}{23}$, if a woman was picked first.
- The probability of the second pick depends on what happened in the first pick.
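The softball probabilities, and the general hypergeometric probability mass function, can be computed exactly with fractions (the 13-women, 11-men roster comes from the example above):

```python
from fractions import Fraction
from math import comb

# Softball roster: picks are made without replacement.
women, men = 13, 11
total = women + men

p_first_woman = Fraction(women, total)                      # 13/24
p_second_man_given_first_woman = Fraction(men, total - 1)   # 11/23, one fewer person remains

# The hypergeometric PMF generalizes this: P(k successes in n draws
# without replacement from N items, K of which are successes).
def hypergeom_pmf(k, n, N, K):
    return Fraction(comb(K, k) * comb(N - K, n - k), comb(N, n))

# For example, P(exactly 1 woman in 2 draws) from this roster:
print(hypergeom_pmf(1, 2, total, women))
```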