Important Distributions
Frank Hansen
Department of Economics
Copenhagen University
2022
Moment generating function
Definition
Let X be a random variable and suppose that exp(tX) has finite mean for every t in an open interval I with 0 ∈ I. We then define
\[ \psi(t) = E[e^{tX}], \qquad t \in I, \]
and call it the moment generating function for X.
We state without proof that two distributions are identical if their moment generating functions coincide in an open interval around 0.
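The derivatives of ψ at 0 generate the moments, E[X^n] = ψ^(n)(0), a fact used repeatedly below. As a quick numerical illustration, the slope of ψ at 0 recovers the mean; a minimal sketch assuming NumPy, with an arbitrary exponential test variable of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=0.5, size=1_000_000)  # a test variable with E[X] = 0.5

def psi(t):
    # Monte Carlo estimate of the moment generating function E[e^{tX}]
    return np.mean(np.exp(t * x))

h = 1e-4
print((psi(h) - psi(-h)) / (2 * h))  # central difference approximating psi'(0)
print(x.mean())                      # both are close to E[X] = 0.5
```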
Linear combinations of independent variables
Theorem
Let X1, ..., Xn be independent stochastic variables with moment generating functions ψ1, ..., ψn defined in open intervals I1, ..., In. The moment generating function of X = a1X1 + ⋯ + anXn is
\[ \psi(t) = \prod_{i=1}^{n} \psi_i(a_i t). \]
Proof. By definition,
\[ \psi(t) = E[e^{tX}] = E\big[e^{t(a_1X_1+\cdots+a_nX_n)}\big] = E\Big[\prod_{i=1}^{n} e^{t a_i X_i}\Big] = \prod_{i=1}^{n} E\big[e^{t a_i X_i}\big] = \prod_{i=1}^{n} \psi_i(a_i t), \]
where the expectation of the product factorises into the product of expectations by independence.
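A small Monte Carlo illustration of the theorem; a sketch assuming NumPy, where the two variables, the weights and the t-values are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x1 = rng.exponential(scale=1.0, size=n)       # draws of X1
x2 = rng.gamma(shape=2.0, scale=0.5, size=n)  # draws of X2, independent of X1
a1, a2 = 0.5, 1.5

for t in (0.1, 0.3):
    lhs = np.mean(np.exp(t * (a1 * x1 + a2 * x2)))                     # psi(t)
    rhs = np.mean(np.exp(t * a1 * x1)) * np.mean(np.exp(t * a2 * x2))  # psi1(a1 t) * psi2(a2 t)
    print(t, lhs, rhs)  # the two estimates agree up to Monte Carlo error
```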
The exponential distribution
The exponential distribution with parameter λ > 0 has density
\[ f_X(x) = \lambda e^{-\lambda x}, \qquad x \ge 0. \]
The expectation of the exponential distribution
The moment generating function ψ for the exponential distribution, with parameter λ > 0, is given by
\[ \psi(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx = \int_{0}^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_{0}^{\infty} e^{(t-\lambda)x}\,dx = \frac{\lambda}{t-\lambda}\Big[ e^{(t-\lambda)x} \Big]_{x=0}^{x=\infty} = \frac{\lambda}{\lambda - t} \]
for t < λ. The expectation is therefore
\[ E[X] = \psi'(0) = \frac{1}{\lambda}. \]
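The same computation can be reproduced symbolically; a minimal sketch assuming SymPy, where the conds='none' option suppresses the convergence condition t < λ handled above:

```python
import sympy as sp

t, lam, x = sp.symbols('t lambda x', positive=True)
# E[e^{tX}] for the exponential density lambda * e^{-lambda x} on [0, oo)
psi = sp.integrate(sp.exp(t * x) * lam * sp.exp(-lam * x), (x, 0, sp.oo), conds='none')
print(sp.simplify(psi))                          # lambda/(lambda - t)
print(sp.simplify(sp.diff(psi, t).subs(t, 0)))   # 1/lambda, i.e. E[X] = psi'(0)
```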
Higher moments of the exponential distribution
We calculate the second derivative of ψ and obtain
\[ \psi''(t) = \frac{2\lambda}{(\lambda - t)^3}. \]
Therefore, the second moment is
\[ E[X^2] = \psi''(0) = \frac{2}{\lambda^2} \]
and the variance is
\[ \mathrm{Var}[X] = E[X^2] - E[X]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}. \]
More generally,
\[ \psi^{(n)}(t) = \frac{\lambda\, n!}{(\lambda - t)^{n+1}}, \qquad t < \lambda. \]
The nth moment of the exponential distribution is thus
\[ E[X^n] = \psi^{(n)}(0) = \lambda^{-n} n!. \]
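A quick numerical cross-check of the moment formula; a sketch assuming SciPy, which parametrises the exponential distribution by scale = 1/λ:

```python
from math import factorial
from scipy.stats import expon

lam = 2.0
X = expon(scale=1 / lam)  # the exponential distribution with parameter lambda = 2
for n in range(1, 5):
    print(n, X.moment(n), factorial(n) / lam**n)  # E[X^n] versus n!/lambda^n
```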
The binomial distribution
A random variable X is said to have the Bernoulli distribution with parameter p, for 0 ≤ p ≤ 1, if it only takes the values 0 and 1 and
\[ P[X = 1] = p \quad\text{and}\quad P[X = 0] = 1 - p. \]
The moment generating function is
\[ \psi(t) = E[e^{tX}] = e^t\,P[X = 1] + e^0\,P[X = 0] = pe^t + 1 - p. \]
The binomial distribution with parameters (n, p) is the distribution of the sum X of n independent Bernoulli distributed variables, each with parameter p. The variable X takes integer values from 0 to n and
\[ P[X = k] = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, 1, \dots, n. \]
By the theorem on linear combinations, the moment generating function is
\[ \psi(t) = (pe^t + 1 - p)^n. \]
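Since X takes only finitely many values, the moment generating function can be checked exactly against this closed form; a minimal sketch assuming SciPy, with n, p and t chosen arbitrarily:

```python
import numpy as np
from scipy.stats import binom

n, p, t = 10, 0.3, 0.7
k = np.arange(n + 1)
direct = np.sum(np.exp(t * k) * binom.pmf(k, n, p))  # E[e^{tX}] summed over the pmf
closed = (p * np.exp(t) + 1 - p) ** n                # (p e^t + 1 - p)^n
print(direct, closed)                                # identical up to rounding
```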
Sum of binomially distributed variables
Let X and Y be independent and binomially distributed with parameters (n, p) and (m, p). The moment generating function of X + Y is
\[ \psi_{X+Y}(t) = (pe^t + 1 - p)^n (pe^t + 1 - p)^m = (pe^t + 1 - p)^{n+m}, \]
so X + Y is binomially distributed with parameters (n + m, p).
The Poisson distribution
A stochastic variable X that takes integer values x = 0, 1, 2, ... is Poisson distributed with parameter λ > 0 if
\[ P[X = x] = e^{-\lambda}\frac{\lambda^x}{x!}, \qquad x = 0, 1, 2, \dots. \]
The moment generating function ψ is given by
\[ \psi(t) = E[e^{tX}] = \sum_{x=0}^{\infty} e^{tx}\,P[X = x] = \sum_{x=0}^{\infty} e^{tx} e^{-\lambda} \frac{\lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda}\exp(\lambda e^t). \]
In particular,
\[ E[X] = \psi'(0) = \lambda. \]
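A symbolic sketch, assuming SymPy, that differentiates the moment generating function just derived and confirms E[X] = ψ′(0) = λ:

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
psi = sp.exp(-lam) * sp.exp(lam * sp.exp(t))    # the MGF derived above
print(sp.simplify(sp.diff(psi, t).subs(t, 0)))  # lambda, i.e. E[X] = psi'(0)
```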
Characterisation of the Poisson distribution
Suppose that phone calls arrive independently and at random.
Let X denote the stochastic variable that counts the number of phone
calls arriving in a certain time interval.
Theorem
X is Poisson distributed with parameter λ = E[X ].
Outline of proof:
We subdivide the time interval into n slots of equal length.
We choose n so large that the probability of two phone calls arriving in
the same slot becomes very small.
(i) The probability p that a phone call arrives in any chosen slot is
\[ p = \frac{\lambda}{n}, \qquad \text{where } \lambda = E[X]. \]
Continuation of proof I
(ii) The probability that exactly k phone calls arrive in the given time interval is given by
\[ P[X = k] = \binom{n}{k} p^k (1-p)^{n-k}, \qquad p = \frac{\lambda}{n}. \]
(iii) Letting n → ∞, we have (1 − λ/n)^{n−k} → e^{−λ} and \(\binom{n}{k} n^{-k} \to 1/k!\). In conclusion,
\[ P[X = 0] = e^{-\lambda} \quad\text{and thus}\quad P[X = k] = e^{-\lambda}\frac{\lambda^k}{k!}, \]
showing that X is Poisson distributed with parameter λ.
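The convergence of the binomial probabilities to the Poisson probability can be observed numerically; a minimal sketch assuming SciPy, with λ and k chosen arbitrarily:

```python
from scipy.stats import binom, poisson

lam, k = 3.0, 4
for n in (10, 100, 10_000):
    print(n, binom.pmf(k, n, lam / n))  # approaches the Poisson probability below
print(poisson.pmf(k, lam))              # e^{-lam} * lam**k / k!
```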
Sum of independent Poisson distributed variables
Let X and Y be independent Poisson distributed variables with parameters λ and μ. The moment generating function of X + Y is
\[ \psi_{X+Y}(t) = e^{-\lambda}\exp(\lambda e^t)\, e^{-\mu}\exp(\mu e^t) = e^{-(\lambda+\mu)}\exp\big((\lambda+\mu) e^t\big), \]
so X + Y is Poisson distributed with parameter λ + μ.
The normal distribution
A stochastic variable X : S → R is said to be normally distributed with parameters (m, σ) if it has a density function f_X of the form
\[ f_X(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\Big(-\frac{(x-m)^2}{2\sigma^2}\Big), \]
where σ > 0 and m ∈ R.
We must prove that the positive function f_X is a density. We make the substitution y = (x − m)/σ and obtain
\[ \int_{-\infty}^{\infty} f_X(x)\,dx = \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}} \exp\Big(-\frac{(x-m)^2}{2\sigma^2}\Big)\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp\Big(-\frac{y^2}{2}\Big)\,dy = 1. \]
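The normalisation can also be checked numerically; a sketch assuming SciPy's quadrature routine, with arbitrary parameters of my own choosing:

```python
import numpy as np
from scipy.integrate import quad

m, sigma = 1.0, 2.0  # arbitrary parameters

def f(x):
    # the normal density with parameters (m, sigma)
    return np.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

value, abserr = quad(f, -np.inf, np.inf)
print(value)  # 1.0 up to quadrature error
```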
Linear combinations of independent normal variables
Let X1, ..., Xk be independent stochastic variables. We assume that each Xi is normally distributed with parameters (mi, σi) for i = 1, ..., k. The moment generating function for Xi is
\[ \psi_i(t) = \exp\Big(m_i t + \frac{1}{2}\sigma_i^2 t^2\Big), \qquad i = 1, \dots, k. \]
Since the stochastic variables are independent, the moment generating function ψ for X = a1X1 + ⋯ + akXk is given by
\[ \psi(t) = \prod_{i=1}^{k} \psi_i(a_i t) = \exp\Big(\Big(\sum_{i=1}^{k} a_i m_i\Big) t + \frac{1}{2}\Big(\sum_{i=1}^{k} a_i^2 \sigma_i^2\Big) t^2\Big). \]
Hence X is normally distributed with mean a1m1 + ⋯ + akmk and variance a1²σ1² + ⋯ + ak²σk².
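A simulation sketch of the mean and variance of a linear combination, assuming NumPy and with all parameters and weights chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x1 = rng.normal(loc=1.0, scale=2.0, size=n)    # parameters (m1, sigma1) = (1, 2)
x2 = rng.normal(loc=-0.5, scale=1.5, size=n)   # parameters (m2, sigma2) = (-0.5, 1.5)
a1, a2 = 3.0, -2.0
x = a1 * x1 + a2 * x2

print(x.mean(), a1 * 1.0 + a2 * -0.5)            # both close to a1*m1 + a2*m2 = 4
print(x.var(), a1**2 * 2.0**2 + a2**2 * 1.5**2)  # both close to 45
```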
An affine function of a normally distributed variable
Let X be normally distributed with parameters (m, σ) and consider the affine function
\[ Y = aX + b. \]
For a ≠ 0 the variable Y is again normally distributed, with mean
\[ E[Y] = am + b \]
and variance
\[ \mathrm{Var}[Y] = a^2\sigma^2. \]
Samples from the normal distribution
A random sample (X1, ..., Xn) from the normal distribution with parameters (m, σ) is a vector of independent stochastic variables X1, ..., Xn which are normally distributed with the same parameters (m, σ). The stochastic variable
\[ \bar{X}_n = \frac{X_1 + \cdots + X_n}{n} \]
is called the sample mean. Since the variables are independent, it follows that \bar{X}_n is normally distributed with mean
\[ E[\bar{X}_n] = \frac{1}{n}\,(m + \cdots + m) = m \]
and variance
\[ \mathrm{Var}[\bar{X}_n] = \frac{1}{n^2}\,(\sigma^2 + \cdots + \sigma^2) = \frac{\sigma^2}{n}. \]
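A simulation sketch of the sample mean, assuming NumPy, with parameters (m, σ) and sample size n of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(3)
m, sigma, n = 2.0, 3.0, 25
samples = rng.normal(loc=m, scale=sigma, size=(100_000, n))  # 100000 samples of size n
xbar = samples.mean(axis=1)                                  # the sample mean of each

print(xbar.mean(), m)            # close to m
print(xbar.var(), sigma**2 / n)  # close to sigma^2/n = 0.36
```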
The multi-normal distribution
Consider a vector-valued stochastic variable X : S → Rⁿ. It is said to be multi-normally distributed with parameters (m, A), where m ∈ Rⁿ and A is a positive definite n × n matrix, if it has density
\[ f_X(x) = \frac{\sqrt{\det A}}{(2\pi)^{n/2}} \exp\Big(-\frac{1}{2}\big(A(x-m)\,\big|\,(x-m)\big)\Big), \qquad x \in \mathbb{R}^n. \]
The vector of means and the covariance matrix
The vector of means is given by
\[ E[X] = (E[X_1], \dots, E[X_n]) \]
and the covariance matrix by Cov[X] = (Cov[X_i, X_j])_{i,j}. For an invertible n × n matrix B we consider the transformed variable
\[ Y' = BX'. \]
In the proof below we use the box ∆u spanned by the vectors
\[ x_i = t_i e_i, \qquad i = 1, \dots, n. \]
Proof
Consider the box ∆u spanned by the vectors t1e1, ..., tnen with volume
\[ \mathrm{Vol}(\Delta u) = t_1 \cdots t_n. \]
If B is an invertible n × n matrix, then the stochastic vector BX′ takes values in u + B∆u when X′ takes values in B⁻¹u + ∆u. Note that the polytope B∆u has volume Vol(B∆u) = t1 ⋯ tn · |det B|. We therefore obtain the relationship
\[ f_{BX'}(u) = \frac{1}{|\det B|}\, f_{X'}(B^{-1}u) = \frac{\sqrt{\det\big((B')^{-1}AB^{-1}\big)}}{(2\pi)^{n/2}} \exp\Big(-\frac{1}{2}\big((B')^{-1}AB^{-1}(u - Bm)\,\big|\,(u - Bm)\big)\Big), \]
so BX′ is multi-normally distributed with parameters (Bm, (B′)⁻¹AB⁻¹).
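Since the matrix A in the density is the inverse of the covariance matrix, the transformation rule corresponds to the covariance of BX′ being B·Cov[X′]·B′. A simulation sketch assuming NumPy, with the mean, covariance matrix and B chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(4)
mean = np.array([1.0, -1.0])
cov = np.array([[2.0, 0.5],
                [0.5, 1.0]])  # covariance matrix of X', i.e. A^{-1}
B = np.array([[1.0, 2.0],
              [0.0, 3.0]])    # an invertible matrix

x = rng.multivariate_normal(mean, cov, size=500_000)  # draws of X'
y = x @ B.T                                           # draws of Y' = BX'

print(np.cov(y, rowvar=False))  # empirical covariance of Y'
print(B @ cov @ B.T)            # theoretical covariance B Cov[X'] B'
```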