
Definition

The expected value of a discrete random variable is its average value, where each value is weighted by its probability.

Let $X$ be a discrete random variable and let $S$ be the set of points such that $P(X=x)>0$ if and only if $x \in S.$ The expected value of $X$ is
$$E[X] = \sum_{x \in S} x\,P(X=x)$$
Using the pmf, we can write the expectation as
$$E[X] = \sum_{x \in S} x\,p(x)$$




Example: Let $X$ be the discrete random variable with pmf $p(1)=0.5,\ p(4)=0.2,\ p(6)=0.3.$ Then the expected value of $X$ is
$$E[X] = 1\,p(1) + 4\,p(4) + 6\,p(6) = 1 \cdot 0.5 + 4 \cdot 0.2 + 6 \cdot 0.3 = 0.5 + 0.8 + 1.8 = 3.1$$
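The computation above can be checked in a few lines of Python; a minimal sketch in which the pmf is stored as a dictionary (the representation is just an illustration, not part of the text):

```python
# pmf from the example above: p(1) = 0.5, p(4) = 0.2, p(6) = 0.3
pmf = {1: 0.5, 4: 0.2, 6: 0.3}

def expected_value(pmf):
    """Sum x * p(x) over the support of the pmf."""
    return sum(x * p for x, p in pmf.items())

print(round(expected_value(pmf), 6))  # 3.1
```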




If $X=0$ identically, then $S=\{0\}$ and the sum has a single term, $0 \cdot P(X=0) = 0,$ so
$$E[X] = \sum_{x \in S} x\,P(X=x) = 0$$

Alternate Definition

An alternate formula for the expected value of a discrete random variable $X$ is
$$E[X] = \sum_{\omega \in \Omega} X(\omega)\,P(\omega)$$




Example: Roll a die that has values from $1$ to $6$ and square that number. What is the average value?

In this example, $\Omega = \{1,2,3,4,5,6\}.$ For every $\omega \in \Omega,$ $P(\omega) = \frac{1}{6}$ and $X(\omega) = \omega^2.$ So, the expected value is
$$E[X] = \sum_{\omega=1}^{6} X(\omega)\,P(\omega) = \sum_{\omega=1}^{6} \frac{\omega^2}{6} = \frac{1+4+9+16+25+36}{6} = \frac{91}{6} = 15.1\overline{6}$$
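The alternate formula sums $X(\omega)\,P(\omega)$ over the sample space, which is easy to spell out for the die example; a sketch using exact fractions to avoid rounding:

```python
from fractions import Fraction

# Omega = {1, ..., 6}, P(omega) = 1/6, X(omega) = omega**2
p = Fraction(1, 6)

# Sum X(omega) * P(omega) over the whole sample space.
e_x = sum(Fraction(w**2) * p for w in range(1, 7))
print(e_x)  # 91/6
```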

Pull Out Constants

Claim: For any discrete random variable $X$ and any constant $c,$
$$E[cX] = cE[X]$$

Proof:
Using the second definition of expectation,
$$E[cX] = \sum_{\omega \in \Omega} cX(\omega)\,P(\omega) = c\sum_{\omega \in \Omega} X(\omega)\,P(\omega) = cE[X]$$




Example: A salesman is trying to make a sale. Let $X$ be the amount of the sale that the salesman will make, and suppose $X$ has the following distribution:
$P(X=0)=0.4$
$P(X=10)=0.3$
$P(X=15)=0.2$
$P(X=50)=0.1$
Also, the salesman works on commission. The salesman gets $75\%$ of the money made in the sale.
What is the expected value of the amount of money the salesman will get on the sale?

We can compute the amount the salesman makes on each sale, then take the average value. That is, we can find the expected value of $75\%$ of the sale, $E[0.75X].$ On the other hand, it is probably easier to compute by hand if we pull out the $75\%$ and find $0.75E[X].$ First find $E[X]$ by direct computation:
$$E[X] = 0 \cdot 0.4 + 10 \cdot 0.3 + 15 \cdot 0.2 + 50 \cdot 0.1 = 3 + 3 + 5 = 11$$
Finally, $0.75E[X] = 0.75 \cdot 11 = 8.25.$ So, the salesman's commission is on average $E[0.75X] = 8.25.$
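Both routes to the answer, averaging the commission directly and pulling the constant out, can be compared numerically; a small sketch using the sale distribution above:

```python
# Sale distribution: P(X = x) for each possible sale amount x.
pmf = {0: 0.4, 10: 0.3, 15: 0.2, 50: 0.1}

# E[X] by direct computation.
e_x = sum(x * p for x, p in pmf.items())

# E[0.75 X]: average the commission on each possible sale.
e_commission = sum(0.75 * x * p for x, p in pmf.items())

print(e_x, e_commission)  # 11.0 8.25
```

Both `e_commission` and `0.75 * e_x` give the same answer, as the claim promises.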

Sums of Random Variables

Claim: For any two discrete random variables $X$ and $Y,$
$$E[X+Y] = E[X] + E[Y]$$

Proof:
Using the second definition of expectation,
$$E[X+Y] = \sum_{\omega \in \Omega} (X+Y)(\omega)\,P(\omega) = \sum_{\omega \in \Omega} \big(X(\omega) + Y(\omega)\big)P(\omega) = \sum_{\omega \in \Omega} \big(X(\omega)\,P(\omega) + Y(\omega)\,P(\omega)\big) = \sum_{\omega \in \Omega} X(\omega)\,P(\omega) + \sum_{\omega \in \Omega} Y(\omega)\,P(\omega) = E[X] + E[Y]$$




Example: Roll a die that has values from $1$ to $6$. Let $X$ be the square of the roll and let $Y$ be the value of the roll itself. What is the average value of $X+Y$?

The example above shows that $E[X] = \frac{91}{6} = 15.1\overline{6}.$ The expected value of $Y$ is
$$E[Y] = \frac{1+2+3+4+5+6}{6} = 3.5$$
The average value of the sum is
$$E[X+Y] = E[X] + E[Y] = \frac{91}{6} + 3.5 = \frac{56}{3} = 18.\overline{6}$$
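The two sides of the identity can be compared directly for the die example; a sketch with exact fractions, computing $E[X+Y]$ both from the definition and as $E[X]+E[Y]$:

```python
from fractions import Fraction

# One die roll: X = roll squared, Y = the roll itself, each outcome has P = 1/6.
p = Fraction(1, 6)
rolls = range(1, 7)

e_sum_direct = sum((w**2 + w) * p for w in rolls)   # E[X + Y] from the definition
e_x = sum(Fraction(w**2) * p for w in rolls)        # E[X]
e_y = sum(Fraction(w) * p for w in rolls)           # E[Y]

print(e_sum_direct, e_x + e_y)  # 56/3 56/3
```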

Linearity of Random Variables

Corollary: Given two random variables $X$ and $Y$ and constants $a, b,$ and $c,$
$$E[aX+bY+c] = aE[X] + bE[Y] + c$$
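One way to sanity-check the corollary is by simulation. The sketch below uses the die example with illustrative constants $a=2,$ $b=3,$ $c=1$ (these values are not from the text, just a demonstration) and compares the empirical average of $aX+bY+c$ to $aE[X]+bE[Y]+c$:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
a, b, c = 2, 3, 1  # illustrative constants, chosen arbitrarily

# Simulate many die rolls; X = roll**2, Y = roll.
samples = []
for _ in range(100_000):
    roll = random.randint(1, 6)
    samples.append(a * roll**2 + b * roll + c)

empirical = sum(samples) / len(samples)
exact = a * (91 / 6) + b * 3.5 + c  # a*E[X] + b*E[Y] + c

print(empirical, exact)  # the two agree closely
```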

Expected Value of Independent Random Variables

Claim: If $X$ and $Y$ are independent, then
$$E[XY] = E[X]E[Y]$$

Proof:
Let $S$ be the set of points on which $X$ has non-zero probability and let $T$ be the set of points on which $Y$ has non-zero probability. Define $R$ to be the set of points $r = st$ for some $s \in S$ and some $t \in T.$ Then $R$ is the set of points on which $XY$ has non-zero probability, since it is exactly the set of products of points on which $X$ and $Y$ both have non-zero probability.

There may be multiple values of $X$ and $Y$ that result in the same product $r.$ For example, if $r=4$ then $X=1$ and $Y=4$ results in $XY=4,$ but $X=2$ and $Y=2$ also results in $XY=4.$ For this reason, for each $r$ we define a set $A_r$ which is all pairs of points $(s,t)$ with $s \in S,$ $t \in T,$ and $st = r.$

Now compute starting from the expected value of $XY.$
$$E[XY] = \sum_{r \in R} r\,P(XY=r) = \sum_{r \in R} r\,P\Big(\bigcup_{(s,t) \in A_r} \{X=s\} \cap \{Y=t\}\Big)$$
This last equation is just writing that if $XY=r$ then there must be some pair of numbers $(s,t)$ such that $X=s,$ $Y=t,$ and $st=r.$ The union is over disjoint sets, so it can be brought out of the probability as a sum.
$$\sum_{r \in R} r\,P\Big(\bigcup_{(s,t) \in A_r} \{X=s\} \cap \{Y=t\}\Big) = \sum_{r \in R} r \sum_{(s,t) \in A_r} P(\{X=s\} \cap \{Y=t\})$$
Next, we bring the $r$ inside the sum and rewrite $r$ using the fact that $r = st$ for every pair $(s,t) \in A_r.$
$$\sum_{r \in R} r \sum_{(s,t) \in A_r} P(\{X=s\} \cap \{Y=t\}) = \sum_{r \in R} \sum_{(s,t) \in A_r} r\,P(\{X=s\} \cap \{Y=t\}) = \sum_{r \in R} \sum_{(s,t) \in A_r} st\,P(\{X=s\} \cap \{Y=t\})$$
Since $X$ and $Y$ are independent, $P(\{X=s\} \cap \{Y=t\}) = P(\{X=s\})\,P(\{Y=t\})$ for every pair $(s,t).$
$$\sum_{r \in R} \sum_{(s,t) \in A_r} st\,P(\{X=s\} \cap \{Y=t\}) = \sum_{r \in R} \sum_{(s,t) \in A_r} st\,P(\{X=s\})\,P(\{Y=t\}) = \sum_{r \in R} \sum_{(s,t) \in A_r} \big(s\,P(\{X=s\})\big)\big(t\,P(\{Y=t\})\big)$$
The sum is over all values $r \in R$ and all pairs $(s,t)$ which multiply to the value $r.$ This sum ranges over all possible pairs $s \in S$ and $t \in T,$ so we can rewrite the sum and finish the computation.
$$\sum_{r \in R} \sum_{(s,t) \in A_r} \big(s\,P(\{X=s\})\big)\big(t\,P(\{Y=t\})\big) = \sum_{s \in S} \sum_{t \in T} \big(s\,P(\{X=s\})\big)\big(t\,P(\{Y=t\})\big) = \Big(\sum_{s \in S} s\,P(\{X=s\})\Big)\Big(\sum_{t \in T} t\,P(\{Y=t\})\Big) = E[X]\,E[Y]$$
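The claim can also be verified by brute-force enumeration over the product sample space. Below is a sketch with two hypothetical pmfs (chosen only for illustration); independence is used where the joint probability is factored:

```python
from fractions import Fraction
from itertools import product

# Hypothetical pmfs for two independent random variables (illustration only).
pmf_x = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
pmf_y = {1: Fraction(3, 5), 5: Fraction(2, 5)}

e_x = sum(s * p for s, p in pmf_x.items())
e_y = sum(t * p for t, p in pmf_y.items())

# Under independence, P(X=s, Y=t) = P(X=s) * P(Y=t),
# so E[XY] is the sum of s*t*P(X=s)*P(Y=t) over all pairs (s, t).
e_xy = sum(s * t * ps * pt
           for (s, ps), (t, pt) in product(pmf_x.items(), pmf_y.items()))

print(e_xy, e_x * e_y)  # 13/5 13/5
```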

Quiz:

For questions 1 through 4, let $X$ and $Y$ be independent random variables with pmf's $p_X$ and $p_Y$ defined as follows:
$$p_X(0)=0.1,\quad p_X(1)=0.5,\quad p_X(2)=0.3,\quad p_X(3)=0.1$$
$$p_Y(-2)=0.4,\quad p_Y(1)=0.2,\quad p_Y(2)=0.3,\quad p_Y(3)=0.1$$


1. Find $E[X].$





2. Find $E[Y].$





3. Find $E[2X+3].$





4. Find $E[XY^2].$



