
Definition

The expected value of a discrete random variable is its average value, where each possible value is weighted by its probability.

Let \(X\) be a discrete random variable and let \(S\) be the set of points such that \(P(X = x) > 0\) if and only if \(x \in S.\) The expected value of \(X\) is \[E[X] = \sum_{x \in S} x \cdot P(X = x)\] Writing \(p\) for the pmf of \(X,\) we can express the expectation as \[E[X] = \sum_{x \in S} x p(x)\]




Example: Let \(X\) be the discrete random variable with pmf \[p(-1) = 0.5, p(4) = 0.2, p(6) = 0.3\] Then the expected value of \(X\) is \begin{align} E[X] & = -1 \cdot p(-1) + 4 \cdot p(4) + 6 \cdot p(6) \\ & = -1 \cdot 0.5 + 4 \cdot 0.2 + 6 \cdot 0.3 \\ & = -0.5 + 0.8 + 1.8 \\ & = 2.1 \end{align}
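As a quick check, here is a short Python sketch (variable names are illustrative) that computes this sum directly from the pmf:

    # pmf of X: maps each value x with P(X = x) > 0 to its probability
    pmf = {-1: 0.5, 4: 0.2, 6: 0.3}

    # E[X] = sum over x in S of x * p(x)
    expected_value = sum(x * p for x, p in pmf.items())

    print(expected_value)  # approximately 2.1 (up to floating-point rounding)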




If \(P(X = 0) = 1,\) then \(S = \{0\}\) and the only term of the sum is \(0 \cdot P(X = 0),\) so \begin{align} E[X] & = \sum_{x \in S} x \cdot P(X = x) \\ & = 0 \cdot 1 \\ & = 0 \end{align}

Alternate Definition

An alternate formula for the expected value of a discrete random variable \(X\) is \[E[X] = \sum_{\omega \in \Omega} X(\omega) P(\omega)\] where \(\Omega\) is the sample space on which \(X\) is defined.




Example: Roll a die with values from \(1\) to \(6\) and square the result. What is the average value of the square?

In this example, \(\Omega = \{1,2,3,4,5,6\}.\) For every \(\omega \in \Omega,\) \(P(\omega) = \frac{1}{6}\) and \(X(\omega) = \omega^2.\) So, the expected value is \begin{align} E[X] & = \sum_{\omega = 1}^6 X(\omega)P(\omega) \\ & = \sum_{\omega = 1}^6 \frac{\omega^2}{6} \\ & = \frac{1 + 4 + 9 + 16 + 25 + 36}{6} \\ & = \frac{91}{6} \\ & = 15.166\dots \end{align}
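The same computation can be checked with a short Python sketch (an illustrative check, not part of the definition), summing \(X(\omega)P(\omega)\) over the sample space:

    # sample space of one die roll; each outcome has probability 1/6
    omega = range(1, 7)

    # E[X] = sum over omega of X(omega) * P(omega), where X(omega) = omega^2
    expected_value = sum(w**2 * (1/6) for w in omega)

    print(expected_value)  # approximately 15.1667, i.e. 91/6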

Pull Out Constants

Claim: For any discrete random variable \(X\) and any constant \(c,\) \[E[cX] = cE[X]\]

Proof:
Using the second definition of expectation, \begin{align} E[cX] & = \sum_{\omega \in \Omega} cX(\omega)P(\omega) \\ & = c \sum_{\omega \in \Omega} X(\omega)P(\omega) \\ & = cE[X] \end{align}




Example: A salesman is trying to make a sale. Let \(X\) be the amount of the sale that the salesman will make, and suppose \(X\) has the following distribution:
\(P(X = 0) = 0.4\)
\(P(X = 10) = 0.3\)
\(P(X = 15) = 0.2\)
\(P(X = 50) = 0.1\)
Also, the salesman works on commission, receiving \(75\%\) of the money made in the sale.
What is the expected value of the amount of money the salesman will get on the sale?

We can compute the amount the salesman makes on each sale, then take the average value. That is, we can find the expected value of \(75\%\) of the sale, \(E[0.75X].\) It is easier to compute by hand, however, if we pull the \(75\%\) out and find \(0.75E[X]\) instead. First find \(E[X]\) by direct computation: \begin{align} E[X] & = 0 \cdot 0.4 + 10 \cdot 0.3 + 15 \cdot 0.2 + 50 \cdot 0.1 \\ & = 0 + 3 + 3 + 5 \\ & = 11 \end{align} Finally, \(0.75E[X] = 0.75 \cdot 11 = 8.25.\) So, the salesman's commission on the sale is on average \(E[0.75X] = 8.25.\)
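A short Python sketch (names are illustrative) confirms that computing \(E[0.75X]\) directly and computing \(0.75E[X]\) give the same answer:

    # pmf of the sale amount X
    pmf = {0: 0.4, 10: 0.3, 15: 0.2, 50: 0.1}

    expected_sale = sum(x * p for x, p in pmf.items())             # E[X]
    commission_direct = sum(0.75 * x * p for x, p in pmf.items())  # E[0.75X]

    print(expected_sale, commission_direct, 0.75 * expected_sale)  # 11.0 8.25 8.25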

Sums of Random Variables

Claim: For any two discrete random variables \(X\) and \(Y,\) \[E[X+Y] = E[X]+E[Y]\]

Proof:
Using the second definition of expectation, \begin{align} E[X+Y] & = \sum_{\omega \in \Omega}(X+Y)(\omega)P(\omega) \\ & = \sum_{\omega \in \Omega}(X(\omega)+Y(\omega))P(\omega) \\ & = \sum_{\omega \in \Omega}\left(X(\omega)P(\omega)+Y(\omega)P(\omega)\right) \\ & = \sum_{\omega \in \Omega}X(\omega)P(\omega)+\sum_{\omega \in \Omega}Y(\omega)P(\omega) \\ & = E[X]+E[Y] \end{align}




Example: Roll a die that has values from 1 to 6. Let \(X\) be the square of the roll and let \(Y\) be the value of the roll itself. What is the average value of \(X+Y\)?

The example above shows that \(E[X] = 15.166\dots.\) The expected value of \(Y\) is \[E[Y] = \frac{1+2+3+4+5+6}{6} = 3.5\] The average value of the sum is \[E[X+Y] = E[X]+E[Y] = (15.166\dots)+3.5 = 18.666\dots\]
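As a check, a short Python sketch (illustrative only) computes \(E[X+Y]\) directly and compares it to \(E[X]+E[Y]\):

    omega = range(1, 7)
    p = 1/6  # each face of the die is equally likely

    e_x = sum(w**2 * p for w in omega)          # E[X], the square of the roll
    e_y = sum(w * p for w in omega)             # E[Y], the roll itself
    e_sum = sum((w**2 + w) * p for w in omega)  # E[X + Y], computed directly

    print(e_sum, e_x + e_y)  # both approximately 18.667, equal up to rounding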

Linearity of Expectation

Corollary: Given two discrete random variables \(X\) and \(Y\) and constants \(a, b,\) and \(c,\) \[E[aX + bY + c] = aE[X] + bE[Y] + c\]
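Although stated without proof, the corollary follows directly from the two claims above, treating the constant \(c\) as a random variable that always takes the value \(c\) (so \(E[c] = c\)): \begin{align} E[aX + bY + c] & = E[aX] + E[bY] + E[c] \\ & = aE[X] + bE[Y] + c \end{align}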

Expected Value of Independent Random Variables

Claim: If \(X\) and \(Y\) are independent, then \[E[XY] = E[X]E[Y]\]

Proof:
Let \(S\) be the set of points on which \(X\) has non-zero probability and let \(T\) be the set of points on which \(Y\) has non-zero probability. Define \(R\) to be the set of points \(r\) such that \(r = st\) for some \(s \in S\) and some \(t \in T.\) Then \(R\) is the set of values that \(XY\) takes with non-zero probability, since \(XY = r\) exactly when \(X = s\) and \(Y = t\) for some pair \((s, t)\) with \(st = r.\)

There may be multiple values of \(X\) and \(Y\) that result in the same product \(r.\) For example, if \(r = 4\) then \(X = 1\) and \(Y = 4\) results in \(XY = 4,\) but \(X = 2\) and \(Y = 2\) also results in \(XY = 4.\) For this reason, for each \(r\) we define a set \(A_r\) which is all pairs of points \((s, t)\) with \(s \in S, t \in T,\) and \(st = r.\)

Now compute starting from the expected value of \(XY.\) \begin{align} E[XY] & = \sum_{r \in R} rP(XY = r) \\ & = \sum_{r \in R} r P\left(\bigcup_{(s, t) \in A_r}\{X = s\} \cap \{Y = t\}\right) \end{align} This last equation is just writing that if \(XY = r\) then there must be some pair of numbers \((s,t)\) such that \(X = s,\) \(Y = t,\) and \(st = r.\) The union is over disjoint sets, so it can be brought out of the probability as a sum. \[\sum_{r \in R} r P\left(\bigcup_{(s, t) \in A_r}\{X = s\} \cap \{Y = t\}\right) = \sum_{r \in R} r \sum_{(s, t) \in A_r}P\left(\{X = s\} \cap \{Y = t\}\right)\] Next, we bring the \(r\) inside the sum and rewrite \(r\) using the fact that \(r = st\) for every pair \((s, t) \in A_r.\) \begin{align} \sum_{r \in R} r \sum_{(s, t) \in A_r}P\left(\{X = s\} \cap \{Y = t\}\right) & = \sum_{r \in R} \sum_{(s, t) \in A_r} rP\left(\{X = s\} \cap \{Y = t\}\right) \\ & = \sum_{r \in R} \sum_{(s, t) \in A_r} stP\left(\{X = s\} \cap \{Y = t\}\right) \end{align} Since \(X\) and \(Y\) are independent, \(P(\{X = s\} \cap \{Y = t\}) = P(\{X = s\})\cdot P(\{Y = t\})\) for every pair \((s, t).\) \begin{align} \sum_{r \in R} \sum_{(s, t) \in A_r} stP\left(\{X = s\} \cap \{Y = t\}\right) & = \sum_{r \in R} \sum_{(s, t) \in A_r} stP(\{X = s\}) P(\{Y = t\}) \\ & = \sum_{r \in R} \sum_{(s, t) \in A_r} (sP(\{X = s\})) \cdot (tP(\{Y = t\})) \end{align} The sum is over all values \(r \in R\) and all pairs \((s, t)\) which multiply to the value \(r.\) This sum ranges over all possible pairs \(s \in S\) and \(t \in T,\) so we can rewrite the sum and finish the computation. \begin{align} \sum_{r \in R} \sum_{(s, t) \in A_r} (sP(\{X = s\})) \cdot (tP(\{Y = t\})) & = \sum_{s \in S} \sum_{t \in T} (sP(\{X = s\})) \cdot (tP(\{Y = t\})) \\ & = \sum_{s \in S} sP(\{X = s\}) \sum_{t \in T} tP(\{Y = t\}) \\ & = E[X]E[Y] \end{align}
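A short Python sketch (an illustrative numerical check, not part of the proof) builds the joint probabilities of an independent pair directly from \(P(\{X = s\} \cap \{Y = t\}) = P(X = s)P(Y = t)\) and confirms \(E[XY] = E[X]E[Y]\):

    # illustrative pmfs for independent X and Y
    pmf_x = {1: 0.5, 2: 0.5}
    pmf_y = {-1: 0.25, 4: 0.75}

    e_x = sum(s * p for s, p in pmf_x.items())
    e_y = sum(t * q for t, q in pmf_y.items())

    # Independence: P(X = s, Y = t) = P(X = s) * P(Y = t),
    # so E[XY] is a double sum over all pairs (s, t).
    e_xy = sum(s * t * p * q
               for s, p in pmf_x.items()
               for t, q in pmf_y.items())

    print(e_xy, e_x * e_y)  # both are 4.125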

Quiz:

For questions \(1\) through \(4,\) let \(X\) and \(Y\) be independent random variables with pmfs \(p_X\) and \(p_Y\) defined as follows: \begin{align} & p_X(0) = 0.1, p_X(1) = 0.5, p_X(2) = 0.3, p_X(3) = 0.1 \\ & p_Y(-2) = 0.4, p_Y(1) = 0.2, p_Y(2) = 0.3, p_Y(3) = 0.1 \end{align}


1. Find \(E[X].\)





2. Find \(E[Y].\)





3. Find \(E[-2X+3].\)





4. Find \(E[XY^2].\)



