PDF & CDF
The probability density function is $$f(x; \mu, \sigma) = {1\over\sqrt{2\pi}\sigma}e^{-{1\over2}{(x-\mu)^2\over\sigma^2}}$$ The cumulative distribution function is defined by $$F(x; \mu, \sigma) = \Phi\left({x-\mu\over\sigma}\right)$$ where $$\Phi(z) = {1\over\sqrt{2\pi}}\int_{-\infty}^{z}e^{-{1\over2}x^2}\ dx$$
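The R snippets later in this post can be cross-checked outside R as well. As a quick sketch, Python's standard-library `statistics.NormalDist` implements the same density and distribution functions (the parameters `mu, sigma = 3.0, 2.0` and the point `x = 4.2` below are arbitrary, chosen only for illustration):

```python
from math import sqrt, pi, exp
from statistics import NormalDist

mu, sigma = 3.0, 2.0                     # illustrative parameters
X = NormalDist(mu, sigma)

# pdf matches f(x; mu, sigma) = exp(-(x-mu)^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)
x = 4.2
f = exp(-0.5 * (x - mu) ** 2 / sigma ** 2) / (sqrt(2 * pi) * sigma)
assert abs(X.pdf(x) - f) < 1e-12

# cdf matches F(x; mu, sigma) = Phi((x - mu) / sigma)
assert abs(X.cdf(x) - NormalDist().cdf((x - mu) / sigma)) < 1e-9
```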
Proof:
$$ \begin{align*} \int_{-\infty}^{\infty}f(x; \mu, \sigma)\ dx &= \int_{-\infty}^{\infty}{1\over\sqrt{2\pi}\sigma}e^{-{1\over2}{(x-\mu)^2\over\sigma^2}}\ dx\\ &= {1\over\sqrt{2\pi}\sigma}\int_{-\infty}^{\infty}e^{-{1\over2}{(x-\mu)^2\over\sigma^2}}\ dx\\ &= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-{1\over2}y^2}\ dy\quad\quad\quad\quad (\mbox{setting}\ y={x-\mu\over\sigma} \rightarrow dx = \sigma\ dy) \end{align*} $$ Let $I = \int_{-\infty}^{\infty}e^{-{1\over2}y^2}\ dy$, then $$ \begin{eqnarray*} I^2 &=& \int_{-\infty}^{\infty}e^{-{1\over2}y^2}\ dy \int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ dx\\ &=& \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}e^{-{1\over2}(y^2+x^2)}\ dy\ dx\quad\quad\quad\quad (\mbox{setting}\ x=r\cos\theta,\ y=r\sin\theta)\\ &=& \int_{0}^{\infty}\int_{0}^{2\pi}e^{-{1\over2}r^2}\ r\ d\theta\ dr\\ && (\mbox{double integral}\ \iint\limits_{D}f(x, y)\ dx\ dy = \iint\limits_{D^*}f(r\cos\theta, r\sin\theta)\ r\ dr\ d\theta)\\ &=& 2\pi\int_{0}^{\infty}re^{-{1\over2}r^2}\ dr\\ &=& -2\pi e^{-{1\over2}r^2}\big|_{0}^{\infty}\\ &=& 2\pi \end{eqnarray*} $$ so $I = \sqrt{2\pi}$. Hence $$\int_{-\infty}^{\infty}f(x; \mu, \sigma)\ dx = {1\over\sqrt{2\pi}}\cdot\sqrt{2\pi} = 1$$
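The normalization can also be verified numerically. Below is a sketch using a crude midpoint rule over a wide interval around $\mu$ (the parameters `mu, sigma = 1.0, 0.5` are arbitrary, purely for illustration):

```python
from math import sqrt, pi, exp

mu, sigma = 1.0, 0.5          # illustrative parameters
def f(x):
    # normal pdf f(x; mu, sigma)
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sqrt(2 * pi) * sigma)

# midpoint rule over [mu - 10 sigma, mu + 10 sigma]; tails beyond are negligible
n, lo, hi = 100_000, mu - 10 * sigma, mu + 10 * sigma
h = (hi - lo) / n
total = sum(f(lo + (i + 0.5) * h) for i in range(n)) * h
assert abs(total - 1.0) < 1e-6   # integrates to 1
```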
Standard Normal Distribution
If $X$ is normally distributed with parameters $\mu$ and $\sigma^2$, then $$Z = {X-\mu\over\sigma}$$ is normally distributed with parameters $0$ and $1$.
Proof:
An important conclusion is that if $X$ is normally distributed with parameters $\mu$ and $\sigma^2$, then $Y = aX + b$ (with $a > 0$) is normally distributed with parameters $a\mu + b$ and $a^2\sigma^2$. Denote by $F_{Y}$ the cumulative distribution function of $Y$: $$ \begin{align*} F_{Y}(x) &= P(Y \leq x)\\ &= P(aX + b \leq x)\\ &= P\left(X \leq {x-b\over a}\right)\\ &= F_{X}\left({x-b\over a}\right) \end{align*} $$ where $F_{X}(x)$ is the cumulative distribution function of $X$. By differentiation, the probability density function of $Y$ is $$ \begin{align*} f_{Y}(x) &= {1\over a}f_{X}\left({x-b\over a}\right)\\ &= {1\over\sqrt{2\pi}a\sigma}e^{-{1\over2}{({x-b\over a}-\mu)^2\over\sigma^2}}\\ &= {1\over\sqrt{2\pi}(a\sigma)}e^{-{1\over2}{(x-b-a\mu)^2\over a^2\sigma^2}}\\ &= {1\over\sqrt{2\pi}(a\sigma)}e^{-{1\over2}{(x-(b+a\mu))^2\over(a\sigma)^2}} \end{align*} $$ which shows that $Y$ is normally distributed with parameters $a\mu + b$ and $a^2\sigma^2$. Applying this result with $a = {1\over\sigma}$ and $b = -{\mu\over\sigma}$, we deduce that $Z = {X-\mu\over\sigma}$ is normally distributed with parameters $0$ and $1$.
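The standardization identity $P(X \leq x) = P\left(Z \leq {x-\mu\over\sigma}\right)$ can be spot-checked numerically; here is a sketch with Python's standard-library `statistics.NormalDist` (the parameters and test points are arbitrary, chosen only for illustration):

```python
from statistics import NormalDist

mu, sigma = 3.0, 3.0          # illustrative parameters
X, Z = NormalDist(mu, sigma), NormalDist()   # Z is standard normal

# P(X <= x) equals P(Z <= (x - mu)/sigma) for every x
for x in (-2.0, 0.0, 3.0, 7.5):
    assert abs(X.cdf(x) - Z.cdf((x - mu) / sigma)) < 1e-12
```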
Mean
The expected value is $$E[X] = \mu$$
Proof:
$$ \begin{align*} E[Z] &= \int_{-\infty}^{\infty}xf_{Z}(x)\ dx\quad\quad\quad\quad\quad\quad\quad (\mbox{setting}\ Z={X-\mu\over\sigma})\\ &= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}xe^{-{1\over2}x^2}\ dx\\ &= -{1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ d\left(-{1\over2}x^2\right)\\ &= -{1\over\sqrt{2\pi}}e^{-{1\over2}x^2}\big|_{-\infty}^{\infty}\\ &= 0 \end{align*} $$ Hence $$ \begin{align*} E[X] &= E\left[\sigma Z + \mu\right]\\ &= \sigma E[Z] + \mu\\ &= \mu \end{align*} $$
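As a numerical sanity check of $E[X] = \mu$, the integral $\int x f(x)\,dx$ can be approximated by a midpoint rule; this is only a sketch, with arbitrary illustrative parameters:

```python
from math import sqrt, pi, exp

mu, sigma = 2.5, 1.3          # illustrative parameters
def f(x):
    # normal pdf f(x; mu, sigma)
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sqrt(2 * pi) * sigma)

# midpoint rule for E[X] = integral of x f(x) dx over [mu - 12 sigma, mu + 12 sigma]
n, lo, hi = 200_000, mu - 12 * sigma, mu + 12 * sigma
h = (hi - lo) / n
mean = sum((lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h) for i in range(n)) * h
assert abs(mean - mu) < 1e-6
```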
Variance
The variance is $$\mbox{Var}(X) = \sigma^2$$
Proof:
$$ \begin{align*} E\left[Z^2\right] &= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}x^2e^{-{1\over2}x^2}\ dx\quad\quad\quad\quad\quad\quad\quad\quad (\mbox{setting}\ Z={X-\mu\over\sigma})\\ &= {1\over\sqrt{2\pi}}\left(-xe^{-{1\over2}x^2}\big|_{-\infty}^{\infty} + \int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ dx\right)\quad\quad\quad (\mbox{integrating by parts})\\ &= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ dx\quad\quad\quad\quad\quad\quad\quad (\mbox{standard normal distribution})\\ &= 1 \end{align*} $$ The integration by parts: $$u = x,\ dv = xe^{-{1\over2}x^2}\ dx$$ $$\implies du = dx,\ v = \int xe^{-{1\over2}x^2}\ dx = -e^{-{1\over2}x^2}$$ $$\implies \int x^2e^{-{1\over2}x^2}\ dx = -xe^{-{1\over2}x^2} + \int e^{-{1\over2}x^2}\ dx$$ Since $E[Z] = 0$, we have $\mbox{Var}(Z) = E\left[Z^2\right] = 1$. Hence $$\mbox{Var}(X) = \mbox{Var}(\sigma Z + \mu) = \sigma^2\mbox{Var}(Z) = \sigma^2$$
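The variance identity can be checked the same way, by numerically integrating $(x-\mu)^2 f(x)$; again a sketch with arbitrary illustrative parameters:

```python
from math import sqrt, pi, exp

mu, sigma = 2.5, 1.3          # illustrative parameters
def f(x):
    # normal pdf f(x; mu, sigma)
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sqrt(2 * pi) * sigma)

# midpoint rule for Var(X) = integral of (x - mu)^2 f(x) dx
n, lo, hi = 200_000, mu - 12 * sigma, mu + 12 * sigma
h = (hi - lo) / n
var = sum((lo + (i + 0.5) * h - mu) ** 2 * f(lo + (i + 0.5) * h)
          for i in range(n)) * h
assert abs(var - sigma ** 2) < 1e-6
```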
Examples
1. If $X$ is a normal random variable with parameters $\mu = 3$ and $\sigma^2 = 9$, find (a) $P(2 < X < 5)$; (b) $P(X > 0)$; (c) $P(|X-3| > 6)$.
Solution:
(a) $$ \begin{align*} P(2 < X < 5) &= P\left({2-3\over3} < {X-3\over3} < {5-3\over3}\right)\\ &= P\left(-{1\over3} < Z < {2\over3}\right)\\ &= \Phi\left({2\over3}\right) - \Phi\left(-{1\over3}\right) = 0.3780661 \end{align*} $$ R Code:
pnorm(2/3) - pnorm(-1/3) # [1] 0.3780661
(b) $$ \begin{align*} P(X > 0) &= P\left({X-3\over3} > {0-3\over3}\right)\\ &= P\left(Z > -1\right)\\ &= 1 - \Phi(-1) = 0.8413447 \end{align*} $$ R Code:
1 - pnorm(-1) # [1] 0.8413447
(c) $$ \begin{align*} P(|X-3| > 6) &= P(X > 9) + P(X < -3)\\ &= P\left({X-3\over3} > {9-3\over3}\right) + P\left({X-3\over3} < {-3-3\over3}\right)\\ &= P(Z > 2) + P(Z < -2)\\ &= 1 - \Phi(2) + \Phi(-2) = 0.04550026 \end{align*} $$ R Code:
1 - pnorm(2) + pnorm(-2) # [1] 0.04550026
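The three answers above can be cross-checked outside R; a sketch using Python's standard-library `statistics.NormalDist`:

```python
from statistics import NormalDist

Z = NormalDist()                       # standard normal
a = Z.cdf(2/3) - Z.cdf(-1/3)           # P(2 < X < 5)
b = 1 - Z.cdf(-1)                      # P(X > 0) = P(Z > -1)
c = (1 - Z.cdf(2)) + Z.cdf(-2)         # P(|X - 3| > 6)

assert abs(a - 0.3780661) < 1e-6
assert abs(b - 0.8413447) < 1e-6
assert abs(c - 0.04550026) < 1e-7
```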
2. Let $X$ be normally distributed with standard deviation $\sigma$. Determine $P\left(|X-\mu| \geq 2\sigma\right)$. Compare with Chebyshev's inequality.
Solution:
$$ \begin{align*} P\left(|X-\mu| \geq 2\sigma\right) &= P\left({X-\mu\over\sigma} \geq 2\right) + P\left({X-\mu\over\sigma} \leq -2\right)\\ &= 2\cdot P\left({X-\mu\over\sigma} \leq -2\right) = 2\Phi(-2) \end{align*} $$ R Code:
2 * pnorm(-2) # [1] 0.04550026
By Chebyshev's inequality, the probability satisfies $$P\left(|X-\mu| \geq 2\sigma\right) \leq {1\over2^2} = 0.25$$ which is a much weaker bound.
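The gap between the exact normal tail probability and the Chebyshev bound is easy to see numerically; a sketch in Python's standard library:

```python
from statistics import NormalDist

exact = 2 * NormalDist().cdf(-2)   # exact two-sided tail: 2 * Phi(-2)
chebyshev = 1 / 2 ** 2             # Chebyshev bound 1/k^2 with k = 2

assert abs(exact - 0.04550026) < 1e-7
assert exact < chebyshev           # the normal tail is far below 0.25
```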
3. Let $X$ be a normally distributed random variable with expected value $\mu = 5$. Assume $P(X \leq 0) = 0.1$. What is the variance of $X$?
Solution:
$$ \begin{align*} P(X \leq 0) &= P\left({X-5\over\sigma} \leq {0-5\over\sigma}\right)\\ &= P\left(Z \leq -{5\over\sigma}\right) = 0.1 \end{align*} $$ Hence by using R:
qnorm(0.1) # [1] -1.281552
$$-{5\over\sigma} = -1.281552\rightarrow \sigma^2 = 15.22186$$
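The same solution can be reproduced with Python's standard-library `statistics.NormalDist`, whose `inv_cdf` plays the role of R's `qnorm`:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf(0.1)      # analogue of qnorm(0.1), about -1.281552
sigma = -5 / z                     # from -5/sigma = z
assert abs(sigma ** 2 - 15.22186) < 1e-4

# sanity check: with mu = 5 and this sigma, P(X <= 0) is indeed 0.1
assert abs(NormalDist(5, sigma).cdf(0) - 0.1) < 1e-9
```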
4. A normally distributed random variable $X$ satisfies $P(X \leq 0) = 0.4$ and $P(X \geq 10) = 0.1$. What are the expected value $\mu$ and the standard deviation $\sigma$?
Solution:
$$P(X \leq 0) = 0.4 \rightarrow \Phi\left({-\mu\over\sigma}\right) = 0.4$$ and $$P(X \geq 10) = 0.1 \rightarrow \Phi\left({10-\mu\over\sigma}\right) = 0.9$$ Thus $$\begin{cases}{-\mu\over\sigma} = -0.2533471\\ {10-\mu\over\sigma} = 1.281552 \end{cases}\rightarrow \begin{cases}\mu = 1.650579\\ \sigma = 6.515088 \end{cases}$$ R Code:
qnorm(0.4) # [1] -0.2533471
qnorm(0.9) # [1] 1.281552
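Solving the same two equations with Python's `statistics.NormalDist` (its `inv_cdf` corresponds to R's `qnorm`) recovers the stated $\mu$ and $\sigma$:

```python
from statistics import NormalDist

Z = NormalDist()
z1, z2 = Z.inv_cdf(0.4), Z.inv_cdf(0.9)   # qnorm(0.4), qnorm(0.9)

# -mu/sigma = z1 and (10 - mu)/sigma = z2  =>  sigma = 10/(z2 - z1), mu = -z1*sigma
sigma = 10 / (z2 - z1)
mu = -z1 * sigma
assert abs(mu - 1.650579) < 1e-5
assert abs(sigma - 6.515088) < 1e-5
```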
5. Consider independent random variables $X \sim N(1, 3)$ and $Y \sim N(2, 4)$. What is $P(X + Y \leq 5)$?
Solution:
$X + Y$ is still normally distributed, with parameters $$\mu = \mu_1 + \mu_2 = 3$$ and $$\sigma^2 = \sigma_1^2 + \sigma_2^2 = 7$$ Hence $$ \begin{align*} P(X + Y \leq 5) &= P\left(Z \leq {5-3\over\sqrt{7}}\right)\\ &= \Phi\left({2\over\sqrt{7}}\right) = 0.7751541 \end{align*} $$ R Code:
pnorm(2/sqrt(7)) # [1] 0.7751541
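Python's `statistics.NormalDist` supports adding two independent normal distributions directly (means and variances add), which gives an independent cross-check of this example:

```python
from statistics import NormalDist

X = NormalDist(1, 3 ** 0.5)   # N(1, 3): mean 1, variance 3
Y = NormalDist(2, 4 ** 0.5)   # N(2, 4): mean 2, variance 4
S = X + Y                     # sum of independent normals: N(3, 7)

assert abs(S.mean - 3) < 1e-12
assert abs(S.variance - 7) < 1e-9
assert abs(S.cdf(5) - 0.7751541) < 1e-6
```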
Reference
- Ross, S. (2010). A First Course in Probability (8th edition). Chapter 5. Pearson. ISBN: 978-0-13-603313-4.
- Brink, D. (2010). Essentials of Statistics: Exercises. Chapters 5 & 15. ISBN: 978-87-7681-409-0.