## Glossary


## Gaussian distribution

The Gaussian distribution, or normal distribution, was in fact studied long before C.F. Gauss was born in 1777; it goes back at least to an article by A. de Moivre in 1733. The main reason for the importance of the normal distribution is the central limit theorem (CLT). The simplest version of the CLT states that if $S_n$ denotes the sum of $n$ independent and identically distributed (iid) random variables with zero expectation and finite variance $\sigma^2$, then $S_n/(\sqrt{n}\,\sigma)$ converges in distribution to a normal distribution with expectation zero and variance 1 as $n$ goes to infinity. This makes it reasonable to approximate many random variables by a normal distribution.

An important application of the CLT is to measure the rate of convergence in the law of large numbers (LLN). The LLN says that the average $\bar{x}_n$ of $n$ iid random variables with finite expectation $\mu$ converges with probability one to $\mu$ as $n$ goes to infinity. If the random variables also have finite variance $\sigma^2$, the CLT gives that $\sqrt{n}(\bar{x}_n-\mu)$ converges in distribution to a normal distribution with expectation zero and variance $\sigma^2$, so the convergence rate in the LLN is $\sqrt{n}$. This also makes it possible to approximate the error made when estimating the expectation $\mu$ of a distribution by an average of $n$ independent observations from it, i.e. $\mu \approx \bar{x}_n + \sigma G/\sqrt{n}$, where $G$ is a standard Gaussian random variable (expectation zero and variance 1). One can then find a constant $\lambda_{\alpha/2}$ such that $P(|\mu-\bar{x}_n| \geq \sigma\lambda_{\alpha/2}/\sqrt{n}) \approx \alpha$, or, put differently, such that the probability that $\mu$ lies in the interval $[\bar{x}_n - \sigma\lambda_{\alpha/2}/\sqrt{n},\, \bar{x}_n + \sigma\lambda_{\alpha/2}/\sqrt{n}]$ is approximately $1-\alpha$. The constant $\lambda_{\alpha/2}$ can be found by solving the equation $N(x)=1-\alpha/2$ for $x$, where $N$ is the distribution function of the standard normal distribution. Neither $N$ nor its inverse is obtainable in closed form, but both are fairly easy to approximate.
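The approximate confidence interval described above can be sketched in a few lines of Python. This is an illustration under assumed inputs (an exponential distribution with known mean, chosen arbitrarily), not part of the glossary entry itself; it uses the standard library's `statistics.NormalDist` to invert the standard normal distribution function.

```python
import random
import statistics

# Illustrative setup (assumed, not from the glossary): estimate the mean of
# an exponential distribution by an average of n iid samples.
random.seed(0)

n = 10_000
alpha = 0.05
mu_true = 2.0  # expectation of the sampled distribution
samples = [random.expovariate(1 / mu_true) for _ in range(n)]

x_bar = statistics.fmean(samples)      # the average of the n observations
sigma_hat = statistics.stdev(samples)  # sample estimate of sigma

# lambda_{alpha/2} solves N(x) = 1 - alpha/2, where N is the standard
# normal distribution function; inv_cdf computes its inverse numerically.
lam = statistics.NormalDist().inv_cdf(1 - alpha / 2)

half_width = sigma_hat * lam / n ** 0.5
interval = (x_bar - half_width, x_bar + half_width)
# For alpha = 0.05, lam is about 1.96, and over repeated experiments the
# interval should cover mu_true roughly 95% of the time.
```

Note that in practice $\sigma$ is usually unknown, so the sample standard deviation is substituted, as above; for large $n$ the normal approximation remains accurate.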

### Characteristics of the normal distribution

• The density function $f(x)$ of a normal distribution with mean $\mu$ and variance $\sigma^2$ is given by
$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}$.
• A multivariate $n$-dimensional normal distribution with mean vector $M$ and covariance matrix $\Sigma$ has density function
$f(x) = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}}\, e^{-(x-M)^T \Sigma^{-1} (x-M)/2}$.
• The moment generating function of a normal random variable $X$ with mean $\mu$ and variance $\sigma^2$ is given by
$M(z) = E[\exp(zX)] = e^{\mu z + \sigma^2 z^2/2}$.
• The moment generating function of a normal random vector $X$ with mean $M$ and covariance $\Sigma$ is given by
$M(z) = E[\exp(z^T X)] = e^{M^T z + z^T \Sigma z/2}$.
• The normal distribution is a location/scale family where μ is the location parameter and σ the scale parameter.
• A normal random variable $X$ with mean $\mu$ and variance $\sigma^2$ can be obtained by an affine transformation of a standard normal random variable $G$ in the following way: $X = \mu + \sigma G$.
• A normal random vector $X$ with mean $M$ and covariance matrix $\Sigma$ can be obtained by an affine transformation of a standard normal random vector $G$ in the following way: $X = M + \Sigma^{1/2} G$.
• The sum of two normal random variables $X$ and $Y$ with means $\mu_X$ and $\mu_Y$ and standard deviations $\sigma_X$ and $\sigma_Y$ is normally distributed with mean $\mu_X + \mu_Y$ and variance $\sigma_X^2 + \sigma_Y^2 + 2C(X,Y)$ if $X$ and $Y$ are jointly (multivariate) normal.
• Two jointly normally distributed random variables X and Y are independent if and only if they have zero covariance (or correlation).
• If X and Y are independent and X+Y is normally distributed then X and Y must be normally distributed.
• A remarkable fact about the normal distribution is that it is the only distribution such that if $X$ and $Y$ are independent with the same variance, then $X+Y$ and $X-Y$ are also independent. More generally, a zero-mean multivariate normal vector with iid components has a distribution that is invariant under orthogonal transformations.
• The only non-trivial random process with independent, stationary increments and continuous sample paths is a Gaussian stochastic process called Brownian motion.
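The affine construction $X = M + \Sigma^{1/2} G$ from the list above can be checked numerically. The sketch below (an illustration with an arbitrarily chosen mean and covariance, using NumPy) substitutes the Cholesky factor $A$ with $AA^T = \Sigma$ for the symmetric square root $\Sigma^{1/2}$; any matrix $A$ satisfying $AA^T = \Sigma$ yields the same distribution.

```python
import numpy as np

# Assumed example parameters, not from the glossary.
rng = np.random.default_rng(0)
M = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

A = np.linalg.cholesky(Sigma)          # A @ A.T == Sigma
G = rng.standard_normal((2, 100_000))  # iid standard normal components
X = M[:, None] + A @ G                 # affine transformation of G

# The sample moments should approximate the target mean and covariance.
mean_hat = X.mean(axis=1)
cov_hat = np.cov(X)
```

Using the Cholesky factor instead of the symmetric square root is the usual choice in practice, since it is cheaper to compute and numerically stable for positive definite $\Sigma$.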