CH 4 Slides
Kuan Xu
University of Toronto
kuan.xu@utoronto.ca
1 Introduction
2 The Probability Distribution for a Continuous Random Variable
3 The Expected Value for Continuous Random Variables
4 The Uniform Probability Distribution
5 The Normal Probability Distribution
6 The Gamma Probability Distribution
7 The Beta Probability Distribution
8 Some General Comments
9 Other Expected Values
10 Tchebysheff’s Theorem
Distribution Function
Let Y denote any random variable. The distribution function of Y ,
denoted by F (y ), is such that F (y ) = P(Y ≤ y ) for −∞ < y < ∞.
Note that the distribution function is defined for any random variable, discrete or continuous, and it must satisfy a set of properties. These properties hold for every random variable.
Now we are ready to define a continuous random variable.
Example:
Let
F(y) = 0,  for y < 0,
F(y) = y,  for 0 ≤ y ≤ 1,
F(y) = 1,  for y > 1.
Figure: (a) F(y); (b) f(y)
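This F(y) is the CDF of the Uniform(0, 1) distribution; the following is a quick check with scipy (an illustration, not part of the original slides):

```python
# Sketch: the example CDF F(y) = y on [0, 1] is the Uniform(0, 1) CDF.
from scipy.stats import uniform

U = uniform(loc=0, scale=1)          # Uniform on [0, 1]
for y in [-0.5, 0.25, 0.5, 0.75, 1.5]:
    print(f"F({y}) = {U.cdf(y):.2f}, f({y}) = {U.pdf(y):.2f}")
# F is 0 below 0, equals y on [0, 1], and is 1 above 1; f = 1 on (0, 1).
```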
Quantile
Let Y denote any random variable. If 0 < p < 1, the pth quantile of Y ,
denoted by ϕp , is the smallest value such that P(Y ≤ ϕp ) = F (ϕp ) ≥ p. If
Y is continuous, ϕp is the smallest value such that
F (ϕp ) = P(Y ≤ ϕp ) = p.
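In practice the pth quantile of a continuous distribution can be computed with scipy's percent-point function ppf, the inverse of the CDF; a minimal sketch (the distributions chosen below are arbitrary illustrative assumptions):

```python
# Sketch: the p-th quantile phi_p satisfies F(phi_p) = p for a continuous Y.
from scipy.stats import norm, uniform

p = 0.90
print(uniform(0, 1).ppf(p))   # 0.9     -> quantile of Uniform(0, 1)
print(norm(0, 1).ppf(p))      # ~1.2816 -> 90th percentile of N(0, 1)
# Check that F(phi_p) = p:
print(norm(0, 1).cdf(norm(0, 1).ppf(p)))  # ~0.90
```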
Theorem 4.3—P(a ≤ Y ≤ b)
If the random variable Y has density function f (y ) and a < b, then the
probability that Y falls in the interval [a, b] is
P(a ≤ Y ≤ b) = ∫_a^b f(y) dy = F(b) − F(a).
Figure: P(a ≤ Y ≤ b)
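A minimal numerical illustration of Theorem 4.3 (the standard normal density is an arbitrary choice of f, not part of the slides):

```python
# Sketch: P(a <= Y <= b) = integral of f over [a, b] = F(b) - F(a),
# illustrated here with the standard normal density.
from scipy.stats import norm
from scipy.integrate import quad

a, b = -1.0, 2.0
integral, _ = quad(norm.pdf, a, b)       # integrate the density over [a, b]
difference = norm.cdf(b) - norm.cdf(a)   # F(b) - F(a)
print(integral, difference)              # both ~0.8186
```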
Theorem 4.4—E (g (Y ))
Let g(Y) be a function of Y; then the expected value of g(Y) is given by
E[g(Y)] = ∫_{−∞}^{∞} g(y) f(y) dy,
provided that the integral exists.
Theorem 4.5—Properties
Let c be a constant and let g (Y ), g1 (Y ), g2 (Y ), . . . , gk (Y ) be functions of
a continuous random variable Y . Then the following results hold:
1. E(c) = c.
2. E[cg(Y)] = cE[g(Y)].
3. E[g1(Y) + g2(Y) + · · · + gk(Y)] = E[g1(Y)] + E[g2(Y)] + · · · + E[gk(Y)].
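These properties can be illustrated numerically; here is a minimal sketch using the density f(y) = (3/8)y² on [0, 2] from the worked example below (scipy and the numerical approach are assumptions, not part of the slides):

```python
# Sketch: checking E(c) = c, E[cg(Y)] = cE[g(Y)], and additivity numerically
# for the density f(y) = (3/8) y^2 on [0, 2].
from scipy.integrate import quad

f = lambda y: (3 / 8) * y**2                        # density on [0, 2]
E = lambda g: quad(lambda y: g(y) * f(y), 0, 2)[0]  # E[g(Y)] by numerical integration

c = 5.0
print(E(lambda y: c))                                             # E(c) = 5.0
print(E(lambda y: c * y), c * E(lambda y: y))                     # E[cY] = c E[Y]
print(E(lambda y: y + y**2), E(lambda y: y) + E(lambda y: y**2))  # additivity
```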
E(Y²) = ∫_0^2 y² (3/8)y² dy = ∫_0^2 (3/8)y⁴ dy = (3/8)(1/5)y⁵ |_0^2 = 2.4.
V(Y) = E(Y²) − [E(Y)]² = 2.4 − 1.5² = 2.4 − 2.25 = .15.
Figure: Density of Y
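A numerical check of this example (density f(y) = (3/8)y² on [0, 2]) using scipy's quad; the code is an illustration, not part of the slides:

```python
# Sketch: verify E(Y) = 1.5, E(Y^2) = 2.4, and V(Y) = 0.15 for f(y) = (3/8) y^2 on [0, 2].
from scipy.integrate import quad

f = lambda y: (3 / 8) * y**2
EY = quad(lambda y: y * f(y), 0, 2)[0]        # 1.5
EY2 = quad(lambda y: y**2 * f(y), 0, 2)[0]    # 2.4
print(EY, EY2, EY2 - EY**2)                   # 1.5, 2.4, 0.15
```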
Parameters
The constants that determine the specific form of a density function are
called parameters of the density function.
For the uniform random variable on the interval (θ₁, θ₂):
E(Y) = (θ₂² − θ₁²) / [2(θ₂ − θ₁)] = (θ₂ + θ₁)/2,        ∵ θ₂² − θ₁² = (θ₂ + θ₁)(θ₂ − θ₁).
E(Y²) = (θ₂³ − θ₁³) / [3(θ₂ − θ₁)] = (θ₂² + θ₂θ₁ + θ₁²)/3,        ∵ θ₂³ − θ₁³ = (θ₂ − θ₁)(θ₂² + θ₂θ₁ + θ₁²).
V(Y) = E(Y²) − [E(Y)]²
= (θ₂² + θ₂θ₁ + θ₁²)/3 − [(θ₂ + θ₁)/2]²
= 4(θ₂² + θ₂θ₁ + θ₁²)/12 − 3(θ₂ + θ₁)²/12
= 4(θ₂² + θ₂θ₁ + θ₁²)/12 − 3(θ₂² + 2θ₂θ₁ + θ₁²)/12
= (θ₂² − 2θ₂θ₁ + θ₁²)/12
= (θ₂ − θ₁)²/12.
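A quick check of the uniform mean and variance with scipy (the interval [2, 10] is an arbitrary illustrative choice, not from the slides):

```python
# Sketch: checking E(Y) = (theta1 + theta2)/2 and V(Y) = (theta2 - theta1)^2/12.
from scipy.stats import uniform

theta1, theta2 = 2.0, 10.0
Y = uniform(loc=theta1, scale=theta2 - theta1)   # scipy parameterizes by loc and scale
print(Y.mean(), (theta1 + theta2) / 2)           # 6.0, 6.0
print(Y.var(), (theta2 - theta1) ** 2 / 12)      # 5.333..., 5.333...
```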
f(y) = (1/√(2πσ²)) e^{−(y−µ)²/(2σ²)},
where −∞ < y < ∞.
z1 = (80 − 75)/10 = .5
z2 = (90 − 75)/10 = 1.5
P(80 ≤ Y ≤ 90) = P(.5 ≤ Z ≤ 1.5) = .2417.
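The same probability can be reproduced with scipy (here Y ∼ N(75, 10²), as implied by the standardization above); the code itself is an illustration, not part of the slides:

```python
# Sketch: reproducing P(80 <= Y <= 90) for Y ~ N(75, 10^2).
from scipy.stats import norm

z1 = (80 - 75) / 10                      # 0.5
z2 = (90 - 75) / 10                      # 1.5
print(norm.cdf(z2) - norm.cdf(z1))       # ~0.2417
print(norm(75, 10).cdf(90) - norm(75, 10).cdf(80))  # same value, without standardizing
```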
where
Γ(α) = ∫_0^∞ y^{α−1} e^{−y} dy.
µ = E(Y) = αβ
and
σ² = V(Y) = αβ².
By definition,
∫_0^∞ [y^{α−1} e^{−y/β} / (β^α Γ(α))] dy = 1.
Hence,
∫_0^∞ y^{α−1} e^{−y/β} dy = β^α Γ(α).
E(Y) = [1/(β^α Γ(α))] ∫_0^∞ y^α e^{−y/β} dy = β^{α+1} Γ(α+1) / [β^α Γ(α)] = αβ,
where the integral ∫_0^∞ y^α e^{−y/β} dy equals β^{α+1} Γ(α+1).
Find E(Y²).
E(Y²) = ∫_0^∞ y² [y^{α−1} e^{−y/β} / (β^α Γ(α))] dy = [1/(β^α Γ(α))] ∫_0^∞ y^{α+1} e^{−y/β} dy
= [1/(β^α Γ(α))] [β^{α+2} Γ(α+2)] = β² (α+1)α Γ(α) / Γ(α) = α(α+1)β².
Find V(Y) = E(Y²) − [E(Y)]².
V(Y) = α(α+1)β² − (αβ)² = α²β² + αβ² − α²β² = αβ².
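A quick numerical check of these gamma moments with scipy (α = 3, β = 2 are arbitrary illustrative values; scipy's scale parameter plays the role of β):

```python
# Sketch: checking E(Y) = alpha*beta and V(Y) = alpha*beta^2 for the gamma distribution.
from scipy.stats import gamma

alpha, beta = 3.0, 2.0
Y = gamma(a=alpha, scale=beta)           # scipy's "scale" is beta
print(Y.mean(), alpha * beta)            # 6.0, 6.0
print(Y.var(), alpha * beta**2)          # 12.0, 12.0
```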
For a chi-square random variable with v degrees of freedom (a gamma with α = v/2 and β = 2),
µ = E(Y) = v
and
σ² = V(Y) = 2v.
For an exponential random variable (a gamma with α = 1),
µ = E(Y) = β
and
σ² = V(Y) = β².
Example: Let Y be the length of life of an electronic component, and suppose Y follows an exponential probability density function. In addition, if a > 0 and b > 0, show that P(Y > a + b | Y > a) = P(Y > b).
Solution: We know
P(Y > a + b | Y > a) = P(Y > a + b) / P(Y > a),
because the intersection of the events (Y > a + b) and (Y > a) is the event (Y > a + b). Note
P(Y > a + b) = ∫_{a+b}^∞ (1/β) e^{−y/β} dy = −e^{−y/β} |_{a+b}^∞ = e^{−(a+b)/β}.
Similarly,
P(Y > a) = ∫_a^∞ (1/β) e^{−y/β} dy = e^{−a/β}.
Therefore,
P(Y > a + b | Y > a) = e^{−(a+b)/β} / e^{−a/β} = e^{−b/β} = P(Y > b).
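A numerical check of this memoryless property (β = 4, a = 1, b = 2 are arbitrary illustrative values, not from the slides), using scipy's survival function sf(y) = P(Y > y):

```python
# Sketch: check P(Y > a + b | Y > a) = P(Y > b) for an exponential Y.
from scipy.stats import expon

beta, a, b = 4.0, 1.0, 2.0
Y = expon(scale=beta)                    # exponential with mean beta
lhs = Y.sf(a + b) / Y.sf(a)              # P(Y > a+b) / P(Y > a)
rhs = Y.sf(b)                            # P(Y > b)
print(lhs, rhs)                          # both ~0.6065
```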
where
B(α, β) = ∫_0^1 y^{α−1} (1 − y)^{β−1} dy = Γ(α)Γ(β) / Γ(α + β).
Note that we are not restricted to 0 ≤ y ≤ 1 for the beta density function, as we can always transform a random variable defined over an interval c ≤ y ≤ d by defining y* = (y − c)/(d − c) so that 0 ≤ y* ≤ 1.
The cumulative distribution function for the beta random variable is
commonly called the incomplete beta function and is denoted by
F(y) = ∫_0^y [t^{α−1} (1 − t)^{β−1} / B(α, β)] dt = I_y(α, β).
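For computation, the incomplete beta function is available in scipy; a minimal sketch (α = 2, β = 3, y = 0.4 are arbitrary illustrative values, not from the slides):

```python
# Sketch: the incomplete beta function I_y(alpha, beta) equals the beta CDF at y.
from scipy.stats import beta as beta_dist
from scipy.special import betainc

alpha, b, y = 2.0, 3.0, 0.4
print(beta_dist(alpha, b).cdf(y))        # F(y)
print(betainc(alpha, b, y))              # I_y(alpha, beta): the same value (~0.5248)
```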
To find µ = E(Y):
µ = E(Y) = ∫_{−∞}^{∞} y f(y) dy
= ∫_0^1 y [y^{α−1} (1 − y)^{β−1} / B(α, β)] dy
= [1/B(α, β)] ∫_0^1 y^α (1 − y)^{β−1} dy
= B(α + 1, β) / B(α, β)        ∵ α > 0 ⇒ α + 1 > 0
= [Γ(α + 1)Γ(β) / Γ(α + β + 1)] · [Γ(α + β) / (Γ(α)Γ(β))]
= [αΓ(α)Γ(β) / ((α + β)Γ(α + β))] · [Γ(α + β) / (Γ(α)Γ(β))]        ∵ Γ(n) = (n − 1)Γ(n − 1)
= α / (α + β).
E(Y²) = [Γ(α + β) / (Γ(α)Γ(β))] ∫_0^1 y^{α+1} (1 − y)^{β−1} dy
= [Γ(α + β) / (Γ(α)Γ(β))] · [Γ(α + 2)Γ(β) / Γ(α + β + 2)]
= [Γ(α + β) / (Γ(α)Γ(β))] · [(α + 1)Γ(α + 1)Γ(β) / ((α + β + 1)Γ(α + β + 1))]
= [Γ(α + β) / (Γ(α)Γ(β))] · [(α + 1)αΓ(α)Γ(β) / ((α + β + 1)(α + β)Γ(α + β))]
= (α + 1)α / [(α + β + 1)(α + β)].
σ² = V(Y) = E(Y²) − [E(Y)]²
= (α + 1)α / [(α + β + 1)(α + β)] − [α / (α + β)]²
= (α² + α) / [(α + β + 1)(α + β)] − α² / (α + β)²
= (α² + α)(α + β) / [(α + β + 1)(α + β)²] − α²(α + β + 1) / [(α + β + 1)(α + β)²]
= [α³ + α²β + α² + αβ − α³ − α²β − α²] / [(α + β + 1)(α + β)²]
= αβ / [(α + β + 1)(α + β)²].
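A quick check of these beta moments with scipy (α = 2, β = 5 are arbitrary illustrative values, not from the slides):

```python
# Sketch: checking E(Y) = alpha/(alpha+beta) and
# V(Y) = alpha*beta/[(alpha+beta)^2 (alpha+beta+1)] for the beta distribution.
from scipy.stats import beta as beta_dist

a, b = 2.0, 5.0
Y = beta_dist(a, b)
print(Y.mean(), a / (a + b))                              # ~0.2857
print(Y.var(), a * b / ((a + b) ** 2 * (a + b + 1)))      # ~0.0255
```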
µ_k = E[(Y − µ)^k],  k = 1, 2, . . . .
For the uniform random variable on (0, θ),
µ′_k = θ^k / (k + 1).
Thus,
µ′_1 = µ = θ/2,
µ′_2 = θ²/3,
and
µ′_3 = θ³/4.
Other Expected Values (3)
Moment-generating Function
If Y is a continuous random variable, then the moment-generating
function of Y is given by
m(t) = E(e^{tY}).
Note
E(e^{tY}) = ∫_{−∞}^{∞} e^{ty} f(y) dy
= ∫_{−∞}^{∞} (1 + ty + t²y²/2! + t³y³/3! + · · ·) f(y) dy        ∵ e^x = 1 + x + x²/2! + x³/3! + · · ·
= ∫_{−∞}^{∞} f(y) dy + t ∫_{−∞}^{∞} y f(y) dy + (t²/2!) ∫_{−∞}^{∞} y² f(y) dy + (t³/3!) ∫_{−∞}^{∞} y³ f(y) dy + · · ·
= 1 + tµ′_1 + (t²/2!)µ′_2 + (t³/3!)µ′_3 + · · ·
Also note
d^k m(t)/dt^k |_{t=0} = µ′_k.
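A small sympy sketch of this moment-extraction idea, using the exponential MGF m(t) = 1/(1 − βt) (the gamma case with α = 1, derived below) as an assumed example; the code is an illustration, not from the slides:

```python
# Sketch: d^k m(t)/dt^k evaluated at t = 0 gives the k-th raw moment mu'_k,
# illustrated with the exponential MGF m(t) = 1/(1 - beta*t).
import sympy as sp

t, beta = sp.symbols("t beta", positive=True)
m = 1 / (1 - beta * t)                       # exponential (gamma with alpha = 1) MGF
for k in range(1, 4):
    print(k, sp.diff(m, t, k).subs(t, 0))    # beta, 2*beta**2, 6*beta**3 = k! * beta^k
```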
m(t) = E(e^{tY}) = ∫_0^∞ e^{ty} [y^{α−1} e^{−y/β} / (β^α Γ(α))] dy
= [1/(β^α Γ(α))] ∫_0^∞ y^{α−1} exp[−y(1/β − t)] dy        (exp(x) = e^x here)
= [1/(β^α Γ(α))] ∫_0^∞ y^{α−1} exp[−y / (β/(1 − βt))] dy.
Note that the integral of the variable factor of any density function must equal the reciprocal of the constant factor. That is, if
f (y ) = cg (y ), where c is a constant, then
∫_{−∞}^{∞} f(y) dy = ∫_{−∞}^{∞} c g(y) dy = 1,
so
∫_{−∞}^{∞} g(y) dy = 1/c.
Applying the above result to the integral in m(t), and noting that β/(1 − βt) > 0 (when β > 0 and 1 − βt > 0, which implies t < 1/β),
g(y) = y^{α−1} exp{−y/[β/(1 − βt)]}
is the variable factor of a gamma density function with parameters α > 0 and β/(1 − βt) > 0. Thus, we obtain
m(t) = [1/(β^α Γ(α))] [β/(1 − βt)]^α Γ(α) = 1/(1 − βt)^α        for t < 1/β.
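As a numerical sanity check (the values α = 2, β = 0.5, t = 0.3 are arbitrary assumptions satisfying t < 1/β), one can compare the integral E(e^{tY}) with (1 − βt)^{−α}:

```python
# Sketch: numerical check that E(e^{tY}) matches (1 - beta*t)^{-alpha} for the gamma density.
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as Gamma

alpha, beta, t = 2.0, 0.5, 0.3
density = lambda y: y**(alpha - 1) * np.exp(-y / beta) / (beta**alpha * Gamma(alpha))
mgf_numeric, _ = quad(lambda y: np.exp(t * y) * density(y), 0, np.inf)
print(mgf_numeric, (1 - beta * t) ** (-alpha))   # both ~1.3841
```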
Example: Expand the moment-generating function of the above example into a power series in t and thereby obtain µ′_k.
Solution: Recall m(t) = (1 − βt)^{−α}. Using the expansion for a binomial term of the form (x + y)^n,¹ we have
m(t) = (1 − βt)^{−α} = 1 + (−α)(1)^{−α−1}(−βt) + [(−α)(−α − 1)/2!](1)^{−α−2}(−βt)² + · · ·
= 1 + αβt + α(α + 1)β² t²/2! + α(α + 1)(α + 2)β³ t³/3! + · · · .
Matching coefficients with m(t) = 1 + tµ′_1 + (t²/2!)µ′_2 + (t³/3!)µ′_3 + · · · gives
µ′_1 = µ = αβ, (see Theorem 4.8)
µ′_2 = α(α + 1)β²,
µ′_3 = α(α + 1)(α + 2)β³.
¹ Given C_j^n = n!/[j!(n − j)!], (x + y)^n = Σ_{j=0}^{n} C_j^n x^{n−j} y^j = C_0^n x^n y^0 + C_1^n x^{n−1} y^1 + C_2^n x^{n−2} y^2 + · · · + C_n^n x^0 y^n.
Other Expected Values (8)
Example: First, let Z ∼ N(0, 1). Find the moment-generating function of Z as follows:
m_Z(t) = E(e^{Zt}) = ∫_{−∞}^{∞} e^{zt} (1/√(2π)) e^{−z²/2} dz
= ∫_{−∞}^{∞} (1/√(2π)) e^{zt − z²/2} dz = ∫_{−∞}^{∞} (1/√(2π)) e^{−z²/2 + zt − t²/2 + t²/2} dz
= ∫_{−∞}^{∞} (1/√(2π)) e^{−(z − t)²/2 + t²/2} dz = e^{t²/2} ∫_{−∞}^{∞} (1/√(2π)) e^{−(z − t)²/2} dz = e^{t²/2},
where the last integral equals 1 because the integrand is the N(t, 1) density.
Second, let Y = a + bZ , where a and b are constants and Z ∼ N(0, 1). Find the moment-generating function for Y .
m_Y(t) = E(e^{tY}) = E(e^{t(a+bZ)}) = E(e^{ta} e^{tbZ})
= e^{ta} E(e^{tbZ}) = e^{ta} m_Z(tb).
Let a = µ and b = σ. Then
m_Y(t) = e^{µt} m_Z(tσ) = e^{µt} e^{(tσ)²/2} = e^{µt + t²σ²/2}.
This result can be verified. We can use m_Y(t) to show Y ∼ N(µ, σ²):
dm_Y(t)/dt |_{t=0} = d[e^{µt + t²σ²/2}]/dt |_{t=0} = (µ + tσ²) e^{µt + t²σ²/2} |_{t=0} = µ = E(Y).
d²m_Y(t)/dt² = d[(µ + tσ²) e^{µt + t²σ²/2}]/dt.
Recall d(uv)/dx = uv′ + vu′. Then
d[(µ + tσ²) e^{µt + t²σ²/2}]/dt |_{t=0} = [(µ + tσ²)² e^{µt + t²σ²/2} + σ² e^{µt + t²σ²/2}] |_{t=0} = µ² + σ² = E(Y²).
V(Y) = E(Y²) − [E(Y)]² = µ² + σ² − µ² = σ².
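A short sympy sketch (an illustration, not part of the slides) confirming that differentiating m_Y(t) = e^{µt + t²σ²/2} at t = 0 reproduces E(Y) = µ and E(Y²) = µ² + σ²:

```python
# Sketch: symbolic check that the normal MGF yields E(Y) = mu, E(Y^2) = mu^2 + sigma^2,
# and hence V(Y) = sigma^2.
import sympy as sp

t, mu, sigma = sp.symbols("t mu sigma", positive=True)
m = sp.exp(mu * t + t**2 * sigma**2 / 2)
EY = sp.diff(m, t, 1).subs(t, 0)             # mu
EY2 = sp.diff(m, t, 2).subs(t, 0)            # mu**2 + sigma**2
print(EY, sp.expand(EY2), sp.simplify(EY2 - EY**2))   # mu, mu**2 + sigma**2, sigma**2
```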
m(t) = [1/(σ√(2π))] ∫_{−∞}^{∞} e^{tu} e^{−u²/(2σ²)} du
= [1/(σ√(2π))] ∫_{−∞}^{∞} exp[−(1/(2σ²))(u² − 2σ²tu)] du.
Multiply and divide the right-hand side by e^{t²σ²/2} to get
m(t) = e^{t²σ²/2} ∫_{−∞}^{∞} exp[−(1/(2σ²))(u² − 2σ²tu + σ⁴t²)] / (σ√(2π)) du
= e^{t²σ²/2} ∫_{−∞}^{∞} exp[−(u − σ²t)²/(2σ²)] / (σ√(2π)) du
= e^{t²σ²/2},
where the last step follows because the integrand is the N(σ²t, σ²) density, whose integral equals 1.
Note
V(Y) = σ² = ∫_{−∞}^{∞} (y − µ)² f(y) dy
= ∫_{−∞}^{µ−kσ} (y − µ)² f(y) dy + ∫_{µ−kσ}^{µ+kσ} (y − µ)² f(y) dy + ∫_{µ+kσ}^{∞} (y − µ)² f(y) dy.
Note that ∫_{µ−kσ}^{µ+kσ} (y − µ)² f(y) dy ≥ 0. Also note that (y − µ)² ≥ k²σ² for all values of y between −∞ and µ − kσ or between µ + kσ and ∞. Therefore,
V(Y) = σ² ≥ ∫_{−∞}^{µ−kσ} k²σ² f(y) dy + ∫_{µ+kσ}^{∞} k²σ² f(y) dy,
so
σ² ≥ k²σ² [P(Y ≤ µ − kσ) + P(Y ≥ µ + kσ)] = k²σ² P(|Y − µ| ≥ kσ).
Dividing by k²σ², we obtain
P(|Y − µ| ≥ kσ) ≤ 1/k²,
or, equivalently,
P(|Y − µ| < kσ) ≥ 1 − 1/k².
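As an empirical illustration (the exponential distribution, its scale, and the sample size are arbitrary choices, not from the slides), the bound can be checked by simulation:

```python
# Sketch: empirical check of Tchebysheff's bound P(|Y - mu| >= k*sigma) <= 1/k^2,
# using a skewed (exponential) distribution.
import numpy as np

rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=1_000_000)
mu, sigma = y.mean(), y.std()
for k in (1.5, 2.0, 3.0):
    tail = np.mean(np.abs(y - mu) >= k * sigma)
    print(f"k={k}: P(|Y-mu|>=k*sigma) ~ {tail:.4f} <= 1/k^2 = {1/k**2:.4f}")
```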