The Normal distribution is ubiquitous in statistics, partially because
of the central limit theorem, which states that the standardized sum of
i.i.d. random variables with finite variance converges in distribution
to a Normal. Linear transformations of Normal random variables are
again Normal: if \(X\) is Normal with mean \(\mu\) and standard
deviation \(\sigma\), then \(aX + b\) is Normal with mean \(a\mu + b\)
and standard deviation \(|a|\sigma\). If you are taking an intro stats
course, you'll likely use the Normal distribution for Z-tests and in
simple linear regression. Under regularity conditions, maximum
likelihood estimators are asymptotically Normal. The Normal
distribution is also called the Gaussian distribution.
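To make the closure under linear transformation concrete, here is a minimal base R sketch; the values of a, b, mu, and sigma are illustrative assumptions, not anything prescribed by the package.

set.seed(1)                              # for reproducibility
mu <- 2; sigma <- 3                      # illustrative parameters
a <- -1.5; b <- 4                        # illustrative transformation

x <- rnorm(1e5, mean = mu, sd = sigma)   # draws from N(mu, sigma^2)
y <- a * x + b                           # a*X + b should again be Normal

mean(y); a * mu + b                      # sample mean vs. theoretical mean (1)
sd(y); abs(a) * sigma                    # sample s.d. vs. theoretical s.d. (4.5)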
We recommend reading this documentation at
https://pkg.mitchelloharawild.com/distributional/, where the math
will render nicely.
In the following, let \(X\) be a Normal random variable with mean
mu = \(\mu\) and standard deviation sigma = \(\sigma\).
Support: \(\mathbb{R}\), the set of all real numbers
Mean: \(\mu\)
Variance: \(\sigma^2\)
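As a quick check of the mean and variance above, the following sketch uses the distributional package's dist_normal() constructor together with its mean() and variance() generics; the parameter values are illustrative.

library(distributional)

d <- dist_normal(mu = 2, sigma = 3)  # illustrative parameters
mean(d)                              # mu = 2
variance(d)                          # sigma^2 = 9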
Probability density function (p.d.f.):
$$
f(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-(x - \mu)^2 / (2 \sigma^2)}
$$
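To connect the formula to code, the sketch below evaluates the p.d.f. at a single illustrative point through the package's density() generic, via base R's dnorm(), and by writing the formula out directly.

library(distributional)

d <- dist_normal(mu = 0, sigma = 1)  # standard Normal

density(d, 0)        # package generic, p.d.f. evaluated at x = 0
dnorm(0)             # base R equivalent
1 / sqrt(2 * pi)     # the formula at x = 0 with mu = 0, sigma = 1 (≈ 0.3989)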
Cumulative distribution function (c.d.f.):
The cumulative distribution function has the form
$$
F(t) = \int_{-\infty}^t \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-(x - \mu)^2 / (2 \sigma^2)} dx
$$
but this integral does not have a closed-form solution and must be
approximated numerically. The notation \(\Phi(t)\) stands for the
c.d.f. of a standard Normal evaluated at \(t\); it is closely related
to the error function via
\(\Phi(t) = \frac{1}{2} \left(1 + \operatorname{erf}(t / \sqrt{2})\right)\).
Z-tables list the value of \(\Phi(t)\) for various \(t\).
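Because \(F(t)\) has no closed form, software evaluates it numerically. The sketch below compares the package's cdf() generic, base R's pnorm(), and a direct numerical integration of the density; the cut-off t = 1.96 is an illustrative choice.

library(distributional)

d <- dist_normal(mu = 0, sigma = 1)  # standard Normal

cdf(d, 1.96)                         # package generic: Phi(1.96) ≈ 0.975
pnorm(1.96)                          # base R equivalent
integrate(dnorm, -Inf, 1.96)$value   # numerically approximating the integral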
Moment generating function (m.g.f.):
$$
E(e^{tX}) = e^{\mu t + \sigma^2 t^2 / 2}
$$
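As a sanity check on the m.g.f., the following base R sketch compares the closed-form expression with a Monte Carlo estimate of \(E(e^{tX})\); the values of t, mu, and sigma are illustrative.

set.seed(1)                               # for reproducibility
mu <- 2; sigma <- 3; t <- 0.1             # illustrative values

exp(mu * t + sigma^2 * t^2 / 2)           # closed form: exp(0.245) ≈ 1.2776
mean(exp(t * rnorm(1e6, mu, sigma)))      # Monte Carlo estimate of E(e^{tX})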