
The covariance between the random variables \(X\) and \(Y\) is \[\text{Cov}(X,Y) = E[(X-E[X])(Y-E[Y])]\] Alternatively, \[\text{Cov}(X,Y) = E[XY]-E[X]E[Y]\]

Example: A coin is flipped \(3\) times. Let \(X\) be the number of heads in the first two flips, and \(Y\) be the number of tails in the last two flips. Find the covariance between \(X\) and \(Y.\)

Solution: First, we can build a table that lists the values of \(X,\) \(Y,\) and \(XY\) on each outcome.

| \(\Omega\) | HHH | HHT | HTH | HTT | THH | THT | TTH | TTT |
|---|---|---|---|---|---|---|---|---|
| \(X\) | 2 | 2 | 1 | 1 | 1 | 1 | 0 | 0 |
| \(Y\) | 0 | 1 | 1 | 2 | 0 | 1 | 1 | 2 |
| \(XY\) | 0 | 2 | 1 | 2 | 0 | 1 | 0 | 0 |

Since each of the \(8\) outcomes has probability \(\frac{1}{8},\) \begin{align} E[X] & = \frac{2+2+1+1+1+1+0+0}{8} = 1 \\ E[Y] & = \frac{0+1+1+2+0+1+1+2}{8} = 1 \\ E[XY] & = \frac{0+2+1+2+0+1+0+0}{8} = \frac{3}{4} \end{align} Therefore, \[\text{Cov}(X,Y) = E[XY]-E[X]E[Y] = \frac{3}{4}-1 = -\frac{1}{4}\]
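The covariance in this example can also be found by brute force over the sample space. A minimal Python sketch (the helper names `E`, `X`, and `Y` are ours, introduced only for this check):

```python
from itertools import product

# Sample space: all 8 equally likely sequences of 3 coin flips.
omega = list(product("HT", repeat=3))

# X = number of heads in the first two flips; Y = tails in the last two.
def X(w): return w[:2].count("H")
def Y(w): return w[1:].count("T")

def E(f):
    """Expectation of f under the uniform distribution on omega."""
    return sum(f(w) for w in omega) / len(omega)

cov = E(lambda w: X(w) * Y(w)) - E(X) * E(Y)
print(cov)  # -0.25
```

The negative covariance matches intuition: the flips overlap in the middle, so more heads in the first two flips makes fewer tails in the last two more likely.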

The covariance of two random variables tells how they are related to one another. If \(X\) and \(Y\) have a positive covariance, as one of \(X\) or \(Y\) increases it is likely that the other increases. If \(X\) and \(Y\) have a negative covariance, as one of \(X\) or \(Y\) increases it is likely that the other decreases.

The following are some of the mathematical properties of covariance:

- \(\text{Cov}(X,X) = \text{Var}(X)\)
- \(\text{Cov}(X,Y) = \text{Cov}(Y,X)\)
- If \(a\) is a number, then \(\text{Cov}(aX,Y) = a\text{Cov}(X,Y).\)
- \(\text{Cov}(X,Y+Z) = \text{Cov}(X,Y)+\text{Cov}(X,Z)\)
- If \(X\) and \(Y\) are independent, \(\text{Cov}(X,Y) = 0\), but \(\text{Cov}(X,Y) = 0\) does not imply independence.

▼ Proof:

- Using the formula for covariance, \[\text{Cov}(X,X) = E[X^2] - E[X]^2 = \text{Var}(X)\]
- Both \(\text{Cov}(X,Y)\) and \(\text{Cov}(Y,X)\) are equal to \(E[XY]-E[X]E[Y].\)
- Using the formula for covariance, \[\text{Cov}(aX,Y) = E[aXY] - E[aX]E[Y]\] A constant can be pulled out from an expectation, so we get \begin{align} E[aXY] - E[aX]E[Y] & = aE[XY] - aE[X]E[Y] \\ & = a(E[XY] - E[X]E[Y]) \\ & = a\text{Cov}(X,Y) \end{align}
- By the definition of covariance, \[\text{Cov}(X,Y+Z) = E[X(Y+Z)]-E[X]E[Y+Z]\] Using the linearity of expectation where it applies, \begin{align} E[X(Y+Z)]-E[X]E[Y+Z] & = E[XY + XZ] - E[X](E[Y]+E[Z]) \\ & = E[XY]+E[XZ]-E[X]E[Y]-E[X]E[Z] \\ & = E[XY]-E[X]E[Y]+E[XZ]-E[X]E[Z] \\ & = \text{Cov}(X,Y)+\text{Cov}(X,Z) \end{align}
- If \(X\) and \(Y\) are independent, then \(E[XY] = E[X]E[Y].\) So, \[\text{Cov}(X,Y) = E[XY]-E[X]E[Y] = 0\] On the other hand, we can construct random variables \(X\) and \(Y\) which have covariance \(0\) but are not independent. Let the sample space be \(\Omega = \{1,2,3,4\}\) with probability \(\frac{1}{4}\) for each point. Define random variables \(X\) and \(Y\) by \[X(1) = -1, X(2) = 1, X(3) = 0, X(4) = 0\] \[Y(1) = 0, Y(2) = 0, Y(3) = -1, Y(4) = 1\] Then for any point \(\omega \in \Omega,\) \(XY(\omega) = 0.\) So, \(E[XY] = 0.\) Also, \(E[X] = 0\) and \(E[Y] = 0.\) Therefore, \[\text{Cov}(X,Y) = E[XY]-E[X]E[Y] = 0-0 = 0\] However, \(X\) and \(Y\) are not independent: \(P(X = 1) = \frac{1}{4}\) and \(P(Y = 1) = \frac{1}{4},\) yet \(P(X = 1, Y = 1) = 0\) since one of \(X\) or \(Y\) is always \(0.\) Because \(P(X = 1, Y = 1) \neq P(X=1)P(Y=1),\) \(X\) and \(Y\) are not independent.
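The counterexample in the last proof is small enough to verify directly. A sketch using exact rational arithmetic (the dictionaries encoding \(X\) and \(Y\) mirror the definitions above):

```python
from fractions import Fraction

# The four equally likely sample points 1..4, with X and Y as in the proof.
p = Fraction(1, 4)
X = {1: -1, 2: 1, 3: 0, 4: 0}
Y = {1: 0, 2: 0, 3: -1, 4: 1}

E_X = sum(p * X[w] for w in X)           # 0
E_Y = sum(p * Y[w] for w in Y)           # 0
E_XY = sum(p * X[w] * Y[w] for w in X)   # 0, since XY = 0 at every point
cov = E_XY - E_X * E_Y
print(cov)  # 0

# Yet X and Y are dependent: P(X=1, Y=1) = 0 while P(X=1)P(Y=1) = 1/16.
p_joint = sum(p for w in X if X[w] == 1 and Y[w] == 1)
p_prod = sum(p for w in X if X[w] == 1) * sum(p for w in Y if Y[w] == 1)
print(p_joint, p_prod)  # 0 1/16
```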

The correlation of \(X\) and \(Y\) is \[\text{Corr}(X,Y) = \frac{\text{Cov}(X,Y)}{\sigma_X \sigma_Y}\] where \(\sigma_X\) and \(\sigma_Y\) are the standard deviations of \(X\) and \(Y.\)

Property 1: The correlation of any \(2\) random variables is in \([-1, 1].\)

We will not prove this fact here, but will link to more advanced material as it becomes available.

Property 2: For any random variable \(X,\) \(\text{Corr}(X, X) = 1.\)

▼ Proof:

By definition of correlation,
\begin{align}
\text{Corr}(X,X) & = \frac{\text{Cov}(X,X)}{\sigma_X^2} \\
& = \frac{\text{Var}(X)}{\text{Var}(X)} \\
& = 1
\end{align}

Property 3: If \(Y\) is a positive multiple of \(X,\) \(\text{Corr}(X, Y) = 1.\)

▼ Proof:

If \(Y = aX\) for some \(a > 0,\) then \(\sigma_Y = a\sigma_X.\) So, by the definition of correlation,
\begin{align}
\text{Corr}(X,aX) & = \frac{\text{Cov}(X,aX)}{a\sigma_X^2} \\
& = \frac{a\text{Cov}(X,X)}{a\sigma_X^2} \\
& = \frac{a\text{Var}(X)}{a\text{Var}(X)} \\
& = 1
\end{align}

Property 4: If \(Y\) is a negative multiple of \(X,\) \(\text{Corr}(X, Y) = -1.\)

▼ Proof:

If \(Y = aX\) for some \(a < 0,\) then \(\sigma_Y = -a\sigma_X\) because a standard deviation is never negative. So, by the definition of correlation,
\begin{align}
\text{Corr}(X,aX) & = \frac{\text{Cov}(X,aX)}{-a\sigma_X^2} \\
& = \frac{a\text{Cov}(X,X)}{-a\sigma_X^2} \\
& = -\frac{a\text{Var}(X)}{a\text{Var}(X)} \\
& = -1
\end{align}
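Properties 3 and 4 are easy to see numerically. The sketch below computes sample correlation on hypothetical data (the helper `corr` is ours); for an exact multiple \(Y = aX\) the result is \(\pm 1\) up to floating-point roundoff:

```python
import math
import random

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(1000)]

def corr(u, v):
    """Sample correlation of two equal-length lists."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)
    su = math.sqrt(sum((a - mu) ** 2 for a in u) / len(u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v) / len(v))
    return cov / (su * sv)

print(corr(xs, [2 * x for x in xs]))   # approximately 1.0
print(corr(xs, [-5 * x for x in xs]))  # approximately -1.0
```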

Property 5: If \(X\) and \(Y\) are independent, \(\text{Corr}(X, Y) = 0.\)

▼ Proof:

If \(X\) and \(Y\) are independent, \(\text{Cov}(X,Y) = 0.\) So,
\[\text{Corr}(X,Y) = \frac{0}{\sigma_X \sigma_Y} = 0\]

Check your understanding:

For the problems, let \(\Omega = \{a,b,c,d\}\) with probability measure \(P(a) = 0.2,\) \(P(b) = 0.4,\) \(P(c) = 0.1,\) and \(P(d) = 0.3.\) The following random variables are defined on the space: \begin{align} & X(a) = 1, X(b) = 1, X(c) = 2, X(d) = 3 \\ & Y(a) = 1, Y(b) = 0, Y(c) = 1, Y(d) = 3 \\ & Z(a) = 4, Z(b) = 3, Z(c) = 1, Z(d) = 0 \end{align}

1. What is the covariance between \(X\) and \(Y?\)


2. What is the covariance between \(X\) and \(Z?\)


3. What is \(\text{Cov}(X, 2Y-Z)?\)


4. What is \(\text{Corr}(Y, Z)?\)
