Independent Events

Events \(A\) and \(B\) are independent if \[P(A \cap B) = P(A) \cdot P(B)\]

Example

Suppose we flip a fair coin twice. Let \(A\) be the event that the first flip is a head. Let \(B\) be the event that the second flip is a head. Let \(C\) be the event that the second flip is a tail.

The sample space is the set of all possible outcomes: \[\Omega = \{HH, HT, TH, TT\}\] The measurable events are all subsets of \(\Omega,\) \(\mathcal{P}(\Omega).\) The probability measure assigns weight \(\frac{1}{4}\) to each point in \(\Omega.\)

We can write the events as subsets of \(\Omega\) and compute their probabilities. \begin{align} & A = \{HH, HT\} \\ & B = \{HH, TH\} \\ & C = \{HT, TT\} \\ \end{align} To compute the probability of \(A,\) we can use the fact that \(A = \{HH\} \cup \{HT\}\) and that \(\{HH\} \cap \{HT\} = \emptyset.\) \begin{align} P(A) & = P(\{HH, HT\}) \\ & = P(\{HH\}) + P(\{HT\}) \\ & = \frac{1}{4} + \frac{1}{4} \\ & = \frac{1}{2} \end{align} Similarly, \(P(B) = \frac{1}{2}\) and \(P(C) = \frac{1}{2}.\)

Now we can use the probabilities to determine which of the events are independent. \begin{align} & P(A) \cdot P(B) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \\ & P(A \cap B) = P(\{HH, HT\} \cap \{HH, TH\}) = P(\{HH\}) = \frac{1}{4} \\ \end{align} So, \(A\) and \(B\) are independent events. \begin{align} & P(A) \cdot P(C) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \\ & P(A \cap C) = P(\{HH, HT\} \cap \{HT, TT\}) = P(\{HT\}) = \frac{1}{4} \\ \end{align} So, \(A\) and \(C\) are independent events. \begin{align} & P(B) \cdot P(C) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \\ & P(B \cap C) = P(\{HH, TH\} \cap \{HT, TT\}) = P(\emptyset) = 0 \\ \end{align} So, \(B\) and \(C\) are not independent events.
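The same computations can be checked mechanically. Here is a minimal Python sketch (the code and names are our own illustration, not part of the page) that enumerates \(\Omega\) and tests each pair of events against the definition:

from fractions import Fraction
from itertools import product

# Sample space of two coin flips, each outcome weighted 1/4.
omega = [a + b for a, b in product("HT", repeat=2)]  # HH, HT, TH, TT
weight = {outcome: Fraction(1, 4) for outcome in omega}

def prob(event):
    # Probability of an event, given as a set of outcomes.
    return sum(weight[o] for o in event)

A = {"HH", "HT"}  # first flip is a head
B = {"HH", "TH"}  # second flip is a head
C = {"HT", "TT"}  # second flip is a tail

for X, Y, name in [(A, B, "A,B"), (A, C, "A,C"), (B, C, "B,C")]:
    print(name, prob(X & Y) == prob(X) * prob(Y))
# Prints: A,B True; A,C True; B,C False -- matching the computations above.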


Picture

There are often misconceptions about independent events, so we give a visualization here.

Independent events \(A\) and \(B\):

[Figure: the sample space \(\Omega,\) with the red set \(A\) overlapping the blue set \(B\)]

The events \(A\) and \(B\) are independent because \(A\) takes up \(1/3\) of \(\Omega\) and \(A\) also takes up \(1/3\) of \(B.\) Similarly, \(B\) takes up \(1/2\) of \(\Omega\) and \(B\) takes up \(1/2\) of \(A.\) In other words, \(P(A \cap B) = \frac{1}{3} \cdot P(B) = \frac{1}{6} = \frac{1}{3} \cdot \frac{1}{2} = P(A) \cdot P(B).\)

See each picture in turn:

\(A\) is \(1/3\) of \(\Omega.\)

\(A\) takes up \(1/3\) of the blue set \(B.\)

\(B\) is \(1/2\) of \(\Omega.\)

\(B\) takes up \(1/2\) of the red set \(A.\)

Independence and Conditioning

If \(A\) and \(B\) are independent events and \(P(B) > 0,\) then \(P(A|B) = P(A).\)

Proof:
This follows by using the definition of conditional probability, then the definition of independence. \begin{align} P(A|B) & = \frac{P(A \cap B)}{P(B)} \\ & = \frac{P(A)P(B)}{P(B)} \\ & = P(A) \end{align}
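As a quick numerical sanity check, here is a short Python sketch (our own illustration) using the coin-flip numbers from the example, where \(P(A) = P(B) = 1/2\) and \(P(A \cap B) = 1/4\):

from fractions import Fraction

P_A, P_B, P_AB = Fraction(1, 2), Fraction(1, 2), Fraction(1, 4)
P_A_given_B = P_AB / P_B    # definition of conditional probability
assert P_A_given_B == P_A   # P(A|B) = P(A) = 1/2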

Independence and Complements

Claim: If \(A\) and \(B\) are independent, then \(A\) and \(B^C\) are independent.

Proof:
Suppose \(A\) and \(B\) are independent. We can write \(A\) as a union of two disjoint events: \(A = (A \cap B) \cup (A \cap B^C).\) Therefore, \[P(A) = P(A \cap B)+P(A \cap B^C)\] Using the above identity, we have \begin{align} P(A \cap B^C) & = P(A) - P(A \cap B) \\ & = P(A) - P(A)P(B) \\ & = P(A)(1-P(B)) \\ & = P(A)P(B^C) \end{align} So, \(A\) and \(B^C\) satisfy the definition of independent events.

By the claim (and the symmetry of the definition of independence), if \(A\) and \(B\) are independent then so are \(A^C\) and \(B;\) applying the claim once more shows that \(A^C\) and \(B^C\) are independent as well.
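The coin-flip numbers also illustrate the claim. A small Python sketch (ours, for illustration) traces the steps of the proof:

from fractions import Fraction

P_A, P_B, P_AB = Fraction(1, 2), Fraction(1, 2), Fraction(1, 4)
P_A_and_notB = P_A - P_AB               # P(A intersect B^C) = P(A) - P(A intersect B)
assert P_A_and_notB == P_A * (1 - P_B)  # equals P(A) * P(B^C)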

Independence of Collections of Events

A collection of events \((A_i : i \in I)\) for some index set \(I\) is said to be a collection of independent events if for every finite subset \(J \subset I,\) \[P\left(\bigcap_{j \in J}A_j\right) = \prod_{j \in J} P(A_j)\]
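For instance, with three fair coin flips and \(A_i\) the event that flip \(i\) is a head, the condition can be verified for every subset \(J\) with two or more elements (subsets of size one hold trivially). A minimal Python sketch, again our own illustration:

from fractions import Fraction
from itertools import combinations, product

omega = list(product("HT", repeat=3))  # 8 equally likely outcomes
events = [{o for o in omega if o[i] == "H"} for i in range(3)]

def prob(event):
    return Fraction(len(event), len(omega))

ok = True
for r in (2, 3):
    for J in combinations(range(3), r):
        inter = set.intersection(*(events[j] for j in J))
        rhs = Fraction(1)
        for j in J:
            rhs *= prob(events[j])
        ok = ok and prob(inter) == rhs
print(ok)  # True: the collection (A_1, A_2, A_3) is independent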

Check your understanding:

1. Events \(A\) and \(B\) are independent. If \(P(A) = 0.4\) and \(P(B)=0.3,\) find \(P(A \cap B).\)

2. Events \(A\) and \(B\) are independent. If \(P(A) = 0.6\) and \(P(B)=0.5,\) find \(P(A|B).\)

3. Events \(A\) and \(B\) are independent. If \(P(A) = 0.5\) and \(P(B)=0.7,\) find \(P(A \cup B).\)

4. A die is rolled twice in a row. Find the probability that the first roll is 3 or higher, and the second roll is 5 or lower.
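If you want to check your answers empirically, a small Monte Carlo sketch (our own, using Python's random module) estimates the probability in question 4; the analogous approach works for the other questions:

import random

random.seed(0)
trials = 100_000
hits = sum(1 for _ in range(trials)
           if random.randint(1, 6) >= 3 and random.randint(1, 6) <= 5)
print(hits / trials)  # should be close to your exact answer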