
Definition

A continuous random variable \(X\) has the normal distribution with mean \(\mu\) and variance \(\sigma^2\) if the pdf of \(X\) is \[f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}\]

We say \(X\) has the Normal\((\mu, \sigma^2)\) distribution.

There is no closed-form cdf for the normal distribution.
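Although the cdf has no elementary closed form, it can be written in terms of the error function, which most languages provide. A minimal Python sketch (the function names here are my own):

```python
import math

def normal_pdf(x, mu=0.0, sigma2=1.0):
    """Density of the Normal(mu, sigma^2) distribution at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def normal_cdf(x, mu=0.0, sigma2=1.0):
    """P(X <= x) via the error function, since no elementary closed form exists."""
    return 0.5 * (1 + math.erf((x - mu) / math.sqrt(2 * sigma2)))
```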



Claim: The function \(f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}\) is a pdf.
Proof:
Since \(f(x) > 0\) for all \(x,\) the only thing that needs to be shown is that \(\int_{-\infty}^\infty f(x) dx = 1.\) First, substitute \(x + \mu\) for \(x\) to eliminate the \(\mu\) in the exponent. \[\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} dx = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{x^2}{2\sigma^2}} dx\] It is easier to compute the square of the integral rather than the integral itself. \begin{align} \left( \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{x^2}{2\sigma^2}} dx\right)^2 & = \left( \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{x^2}{2\sigma^2}} dx\right)\left( \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{x^2}{2\sigma^2}} dx\right) \\ & = \left( \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{x^2}{2\sigma^2}} dx\right)\left( \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}} dy\right) \\ & = \frac{1}{2\pi\sigma^2}\int_{-\infty}^\infty \int_{-\infty}^\infty e^{-\frac{x^2}{2\sigma^2}}e^{-\frac{y^2}{2\sigma^2}} dxdy \\ & = \frac{1}{2\pi\sigma^2}\int_{-\infty}^\infty \int_{-\infty}^\infty e^{-\frac{x^2+y^2}{2\sigma^2}} dxdy \end{align} Now use polar coordinates.
Substitute \(r^2\) for \(x^2+y^2\) and \(rdrd\theta\) for \(dxdy.\) \begin{align} \frac{1}{2\pi\sigma^2}\int_{-\infty}^\infty \int_{-\infty}^\infty e^{-\frac{x^2+y^2}{2\sigma^2}} dxdy & = \frac{1}{2\pi\sigma^2}\int_0^{2\pi} \int_0^\infty e^{-\frac{r^2}{2\sigma^2}} rdrd\theta \\ & = \frac{1}{2\pi\sigma^2}\int_0^{2\pi} \int_0^\infty re^{-\frac{r^2}{2\sigma^2}} drd\theta \end{align} Substitute \(u = r^2,\) so \(du = 2rdr.\) \begin{align} \frac{1}{2\pi\sigma^2}\int_0^{2\pi} \int_0^\infty re^{-\frac{r^2}{2\sigma^2}} drd\theta & = \frac{1}{2\pi\sigma^2}\int_0^{2\pi} \int_0^\infty \frac{1}{2}e^{-\frac{u}{2\sigma^2}} dud\theta \\ & = \frac{1}{2\pi\sigma^2}\int_0^{2\pi} \left.\frac{-2\sigma^2}{2}e^{-\frac{u}{2\sigma^2}}\right|_{u=0}^\infty d\theta \\ & = \frac{1}{2\pi\sigma^2}\int_0^{2\pi} \sigma^2 d\theta \\ & = \frac{1}{2\pi\sigma^2}2\pi\sigma^2 \\ & = 1 \end{align} Since \(\left(\int_{-\infty}^\infty f(x) dx\right)^2 = 1\) and \(f(x) > 0,\) it follows that \(\int_{-\infty}^\infty f(x) dx = 1.\)
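The claim can also be sanity-checked numerically. The sketch below approximates the integral with a midpoint Riemann sum over a wide interval, for one arbitrary choice of \(\mu\) and \(\sigma^2\) (the mass in the tails outside the interval is negligible):

```python
import math

mu, sigma2 = 1.0, 4.0  # arbitrary choice of parameters
f = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# Midpoint Riemann sum over [mu - 40, mu + 40]; tails beyond are negligible.
n = 200_000
a, b = mu - 40.0, mu + 40.0
h = (b - a) / n
total = sum(f(a + (i + 0.5) * h) for i in range(n)) * h
print(total)  # ≈ 1.0
```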



Claim: The expected value of \(X\) is \(E[X] = \mu.\)
Proof:
The result can be found by direct computation. \begin{align} E[X] & = \int_{-\infty}^\infty \frac{x}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} dx \\ & = \int_{-\infty}^\infty \frac{y+\mu}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}} dy \end{align} where the last line follows by substituting \(y+\mu\) for \(x.\) Splitting the integral over the sum \(y+\mu,\) we get \begin{align} \int_{-\infty}^\infty \frac{y+\mu}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}} dy & = \int_{-\infty}^\infty \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy + \int_{-\infty}^\infty \frac{\mu}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy \\ & = \int_{-\infty}^\infty \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy + \mu\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy \end{align} Now we compute each integral in turn. The first integral is \begin{align} \int_{-\infty}^\infty \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy & = \int_{-\infty}^0 \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy + \int_0^\infty \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy \\ & = -\int_{0}^\infty \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy + \int_0^\infty \frac{y}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy \\ & = 0 \end{align} where the second line follows from a substitution of \(-y\) for \(y.\)

The second integral is an integral of the pdf of a Normal\((0, \sigma^2)\) random variable, so it is equal to \(1.\) \begin{align} \mu\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{y^2}{2\sigma^2}}dy & = \mu \cdot 1 \\ & = \mu \end{align} Adding the two pieces gives \(E[X] = 0 + \mu = \mu.\)
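A quick numerical check of \(E[X] = \mu,\) using a midpoint Riemann sum for \(\int x f(x) dx\) with an arbitrary choice of \(\mu\) and \(\sigma^2\):

```python
import math

mu, sigma2 = 3.0, 2.0  # arbitrary choice of parameters
f = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# Midpoint Riemann sum for E[X] = integral of x * f(x) over a wide interval.
n = 200_000
a, b = mu - 40.0, mu + 40.0
h = (b - a) / n
mean = sum((a + (i + 0.5) * h) * f(a + (i + 0.5) * h) for i in range(n)) * h
print(mean)  # ≈ 3.0
```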




Claim: The variance of \(X\) is \(\text{Var}(X) = \sigma^2.\)

The proof of this fact is reserved for later.
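Even with the proof deferred, the claim can be sanity-checked numerically by integrating \((x-\mu)^2 f(x)\) with the same kind of midpoint sum, for one arbitrary choice of parameters:

```python
import math

mu, sigma2 = 3.0, 2.0  # arbitrary choice of parameters
f = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# Midpoint Riemann sum for Var(X) = integral of (x - mu)^2 * f(x).
n = 200_000
a, b = mu - 40.0, mu + 40.0
h = (b - a) / n
var = sum((a + (i + 0.5) * h - mu) ** 2 * f(a + (i + 0.5) * h) for i in range(n)) * h
print(var)  # ≈ 2.0
```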

Standard Normal

A standard normal random variable has mean \(0\) and variance \(1.\) The pdf for a standard normal random variable is \[f(x) = \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}\]

Claim: A normal random variable \(X\) with mean \(\mu\) and standard deviation \(\sigma\) can be transformed to a standard normal random variable by subtracting the mean and dividing by the standard deviation: \[\frac{X - \mu}{\sigma}\]

Proof:
Let \(Y = \frac{X - \mu}{\sigma},\) and let \(F_X(t)\) and \(F_Y(t)\) be the cdf's of \(X\) and \(Y,\) respectively. By computation, we can write \(F_Y(t)\) in terms of \(F_X(t).\) \begin{align} F_Y(t) & = P\left(\frac{X-\mu}{\sigma} \leq t\right) \\ & = P(X \leq \sigma t + \mu) \\ & = F_X(\sigma t + \mu) \end{align} Take the derivative of both sides to convert the cdf's to pdf's. By the chain rule, \[\frac{d}{dt}F_Y(t) = \frac{d}{dt}F_X(\sigma t + \mu) \Rightarrow f_Y(t) = \sigma f_X(\sigma t + \mu)\] So, the pdf of \(Y\) is \begin{align} f_Y(t) & = \sigma \cdot \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{((\sigma t + \mu) - \mu)^2}{2\sigma^2}} \\ & = \frac{1}{\sqrt{2\pi}}e^{-\frac{t^2}{2}} \end{align} which shows \(Y,\) or \(\frac{X-\mu}{\sigma},\) has the standard normal distribution.
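The transformation can also be checked by simulation: draw from a Normal\((\mu, \sigma^2)\) distribution, standardize each draw, and confirm the sample mean and standard deviation land near \(0\) and \(1.\) A sketch using Python's standard library:

```python
import random
import statistics

random.seed(0)  # fixed seed so the check is reproducible
mu, sigma = 3.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]
zs = [(x - mu) / sigma for x in xs]  # standardize each draw

print(statistics.mean(zs), statistics.stdev(zs))  # both should be near 0 and 1
```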

Bell Curve

The pdf of a normal random variable takes the shape of a bell curve. The highest point on the curve is at the mean, and the curve is always symmetric about the mean. The curve will be narrow or wide depending on the standard deviation.

This graph shows the pdf of a standard normal random variable.

This graph shows the pdf of a Normal\((0,2)\) random variable. The standard normal is in gray for reference. The higher variance makes the graph wider and flatter.

This graph shows the pdf of a Normal\((1,1)\) random variable. The standard normal is in gray for reference. Raising the mean shifts the graph to the right.

Standard Normal Table

You can refer to the standard normal table when finding probabilities of normally distributed random variables.
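If a printed table is not handy, its entries can be regenerated from the error function, since \(P(Z < z) = \frac{1}{2}\left(1 + \text{erf}\left(\frac{z}{\sqrt{2}}\right)\right).\) A sketch that prints the first few rows in the usual row-plus-column layout:

```python
import math

def phi(z):
    """Standard normal cdf, P(Z < z)."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Rows 0.0 through 0.5; columns .00 through .09, as in a printed table.
for row in range(6):
    z = row / 10
    print(f"{z:.1f}", " ".join(f"{phi(z + c / 100):.5f}" for c in range(10)))
```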

Standard Normal Table:


Example: Let \(X\) be normally distributed with mean \(3\) and standard deviation \(2.\) Find the following:

  1. \(P(X < 4)\)
  2. \(P(X > 2)\)
  3. \(P(X < 0)\)

Solution: For each case, transform \(X\) into a standard normal random variable by subtracting the mean and dividing by the standard deviation. Let \(Z\) be a standard normal random variable.

  1. First, standardize the random variable. \begin{align} P(X < 4) & = P\left(\frac{X-3}{2} < 0.5\right) \\ & = P(Z < 0.5) \end{align} In the table, look in the \(0.5\) row and the \(.00\) column, since added together these give \(0.50.\) The number found there is \(0.69146.\) Therefore, \[P(X < 4) \approx 0.69146\]
  2. Standardize the normal variable. \begin{align} P(X > 2) & = P\left(\frac{X-3}{2} > -0.5\right) \\ & = P(Z > -0.5) \end{align} By symmetry of the distribution, \(P(Z > -0.5) = P(Z \leq 0.5).\) Since \(Z\) is continuous, this is the same as \(P(Z < 0.5).\) In part \(1,\) we found \(P(Z < 0.5) \approx 0.69146.\) So, \(P(X > 2) \approx 0.69146.\)
  3. Standardize the normal variable. \begin{align} P(X < 0) & = P\left(\frac{X-3}{2} < -1.5\right) \\ & = P(Z < -1.5) \end{align} The standard normal table only shows the probability that \(Z\) is less than a positive value, so we need to rearrange the inequality.

    By symmetry, \(P(Z < -1.5) = P(Z > 1.5).\) Using the fact that \(P(A) = 1-P(A^C),\) \[P(Z > 1.5) = 1 - P(Z \leq 1.5)\] Since \(Z\) is continuous, \(P(Z \leq 1.5) = P(Z < 1.5).\) This probability can be found in row \(1.5\) and column \(.00,\) since \(1.5+.00 = 1.50.\) The number there is \(0.93319.\) Therefore, \[P(X < 0) \approx 1-0.93319 = 0.06681\]
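The three table lookups above can be cross-checked in code; the sketch below recomputes each probability with the error-function form of the standard normal cdf:

```python
import math

def phi(z):
    """Standard normal cdf, P(Z < z)."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 3.0, 2.0
p1 = phi((4 - mu) / sigma)      # P(X < 4) = P(Z < 0.5)
p2 = 1 - phi((2 - mu) / sigma)  # P(X > 2) = P(Z > -0.5)
p3 = phi((0 - mu) / sigma)      # P(X < 0) = P(Z < -1.5)
print(round(p1, 5), round(p2, 5), round(p3, 5))  # 0.69146 0.69146 0.06681
```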

1. Let \(Z\) be a standard normal random variable. Find \(P(Z > 0).\)





2. Let \(X\) have the Normal\((-2,4)\) distribution. Find \(P(X > 0).\)





3. Let \(X\) and \(Y\) be normally distributed with mean \(0.\) If \(Var(X) > Var(Y),\) which is true?




4. Let \(X\) be normally distributed with mean \(1\) and variance \(2.\) What is \(P(0 < X < 2)?\)



