
Chapter 5

Common Continuous Distributions

5.1 Normal Distribution


5.1.1 Introduction
The normal distribution is, by far, the most important continuous distribution in the entire
field of statistics. It was introduced by the French mathematician Abraham de Moivre in 1733,
the year in which he developed the mathematical equation of its curve.

The normal distribution is also referred to as the Gaussian distribution in honor of Gauss
(1777 − 1855), who also derived it from a study of errors in repeated measurements of the same
quantity.

Definition 5.1.1 A random variable X is said to have a Normal distribution with parameters
µ and σ 2 if its pdf is given by
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{1}{2}(x-\mu)^2/\sigma^2}    (5.1)

or restated as

f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{1}{2}\,\frac{(x-\mu)^2}{\sigma^2}\right), \qquad -\infty < x < \infty, \ \sigma^2 > 0

where π ≈ 3.14159 and e ≈ 2.71828 is the base of the natural logarithms.

Remark 5.1.1 Any distribution defined by the density function given above is a normal distribution. Thus, X is said to be normally distributed with mean µ and variance σ², written as

X ∼ N(µ, σ²)

The name normal originated in connection with the theory of errors of measurement in the
eighteenth century, when it was found that errors ‘normally’ had the distribution function given by
the equation for f(x) above.


Example 5.1.1 Each different choice of specific numerical values for the pair µ and σ gives
a different bell curve. The value of µ determines the location of the curve, as shown in Figure
5.1. In each case the curve is symmetric about µ.

Figure 5.1: Bell Curves with σ = 0.25 and Different Values of µ

Example 5.1.2 The value of σ determines whether the bell curve is tall and thin or short
and squat, subject always to the condition that the total area under the curve be equal to 1.
This is shown in Figure 5.2, where we have arbitrarily chosen to center the curves at µ = 6.

Figure 5.2: Bell Curves with µ = 6 and Different Values of σ.

5.1.2 Definition: normal distribution


Definition 5.1.2 The probability distribution corresponding to the density function for the
bell curve with parameters µ and σ is called the normal distribution with mean µ and standard
deviation σ.


5.1.3 Definition: normally distributed random variable


Definition 5.1.3 A continuous random variable whose probabilities are described by the
normal distribution with mean µ and standard deviation σ is called a normally distributed
random variable, or a normal random variable for short, with mean µ and standard deviation
σ.
Example 5.1.3 Figure 5.3 shows the density function that determines the normal distribution
with mean µ and standard deviation σ. We repeat an important fact about this curve: The
density curve for the normal distribution is symmetric about the mean.

Figure 5.3: Density Function for a Normally Distributed Random Variable with Mean µ and
Standard Deviation σ

Example 5.1.4 Heights of 25-year-old men in a certain region have mean 69.75 inches and
standard deviation 2.59 inches. These heights are approximately normally distributed. Thus
the height X of a randomly selected 25-year-old man is a normal random variable with mean
µ = 69.75 and standard deviation σ = 2.59. Sketch a qualitatively accurate graph of the density
function for X. Find the probability that a randomly selected 25-year-old man is more than
69.75 inches tall.
Solution : The distribution of heights looks like the bell curve in Figure 5.4. The
important point is that it is centered at its mean, 69.75, and is symmetric about the
mean.

Figure 5.4: Density Function for Heights of 25-Year-Old Men

Since the total area under the curve is 1, by symmetry the area to the right of 69.75
is half the total, or 0.5. But this area is precisely the probability P (X > 69.75), the
probability that a randomly selected 25-year-old man is more than 69.75 inches tall.


5.1.4 The Standard Normal Distribution


Definition 5.1.4 A standard normal random variable is a normally distributed random
variable with mean µ = 0 and standard deviation σ = 1. It will always be denoted by the letter
Z.

Note 5.1.1 The density function for a standard normal random variable is shown in Figure
5.5.

Figure 5.5: Density Curve for a Standard Normal Random Variable

Remark 5.1.2 To compute probabilities for Z we will not work with its density function
directly but instead read probabilities from the positive normal table in Section 8.3.2. The
tables are tables of cumulative probabilities; their entries are probabilities of the form P (Z < z).
The use of the tables is explained in the following series of examples.


Example 5.1.5 Find the probabilities indicated, where as always Z denotes a standard normal
random variable.
1.) P (Z < 1.48).
Solution : Figure 5.6 shows how this probability is read directly from the
positive normal table without any computation required. The digits in the ones
and tenths places of 1.48, namely 1.4, are used to select the appropriate row of the
table; the hundredths part of 1.48, namely 0.08, is used to select the appropriate
column of the table. The four decimal place number in the interior of the table that
lies in the intersection of the row and column selected, 0.9306, is the probability
sought:
P (Z < 1.48) = 0.9306

Figure 5.6: Computing Probabilities Using the Cumulative Table


2.) P (Z < −0.25).
Solution : The minus sign in −0.25 makes no difference in the procedure; the
table is used in exactly the same way as in the previous part, except that we now use the
negative normal table in Section 8.3.1: the probability sought is the number in the
intersection of the row with heading −0.2 and the column with heading 0.05, namely
0.4013. Thus P (Z < −0.25) = 0.4013. ■
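These table lookups can also be cross-checked numerically. The following is a minimal Python sketch using scipy.stats (an illustration added here, not part of the original text); the printed values should agree with the table entries to four decimal places.

```python
from scipy.stats import norm

# Cumulative probabilities of the standard normal distribution,
# matching the table lookups above.
print(round(norm.cdf(1.48), 4))    # P(Z < 1.48)  -> 0.9306
print(round(norm.cdf(-0.25), 4))   # P(Z < -0.25) -> 0.4013
```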

Example 5.1.6 Find the probabilities indicated.


1.) P (Z > 1.60).
Solution : Because the events Z > 1.60 and Z ≤ 1.60 are complements, the
Probability Rule for Complements implies that
P (Z > 1.60) = 1 − P (Z ≤ 1.60)
Since inclusion of the endpoint makes no difference for the continuous random
variable Z, P (Z ≤ 1.60) = P (Z < 1.60), which we know how to find from the
table. The number in the row with heading 1.6 and in the column with heading
0.00 is 0.9452. Thus P (Z < 1.60) = 0.9452 so
P (Z > 1.60) = 1 − P (Z ≤ 1.60) = 1 − 0.9452 = 0.0548


Geometrically, since the total area under the curve is 1 and the area of the region
to the left of 1.60 is (from the table) 0.9452, the area of the region to the right of
1.60 must be 1 − 0.9452 = 0.0548. ■
2.) P (Z > −1.02).
Solution : The minus sign in −1.02 makes no difference in the procedure;
the negative normal table is used in exactly the same way. The number in the
intersection of the row with heading −1.0 and the column with heading 0.02 is
0.1539. This means that P (Z < −1.02) = P (Z ≤ −1.02) = 0.1539. Hence
P (Z > −1.02) = 1 − P (Z ≤ −1.02) = 1 − 0.1539 = 0.8461

Example 5.1.7 Find the probabilities indicated.
1. P (0.5 < Z < 1.57).
Solution :
P (0.5 < Z < 1.57) = P (Z < 1.57) − P (Z < 0.50) = 0.9418 − 0.6915 = 0.2503

2. P (−2.55 < Z < 0.09).
Solution :
P (−2.55 < Z < 0.09) = P (Z < 0.09)−P (Z < −2.55) = 0.5359−0.0054 = 0.5305

Example 5.1.8 Find the probabilities indicated.
1.) P (1.13 < Z < 4.16).
Solution : We attempt to compute the probability exactly by looking up the
numbers 1.13 and 4.16 in the table. We obtain the value 0.8708 for the area of
the region under the density curve to left of 1.13 without any problem, but when
we go to look up the number 4.16 in the table, it is not there. We can see from
the last row of numbers in the table that the area to the left of 4.16 must be so
close to 1 that to four decimal places it rounds to 1.0000. Therefore
P (1.13 < Z < 4.16) = P (Z < 4.16) − P (Z < 1.13) = 1.0000 − 0.8708 = 0.1292

2.) P (−5.22 < Z < 2.15).
Solution : Similarly, here we can read directly from the table that the area
under the density curve and to the left of 2.15 is 0.9842, but −5.22 is too far to
the left on the number line to be in the table. We can see from the first line of the
table that the area to the left of −5.22 must be so close to 0 that to four decimal
places it rounds to 0.0000. Therefore
P (−5.22 < Z < 2.15) = P (Z < 2.15)−P (Z < −5.22) = 0.9842−0.0000 = 0.9842


Remark 5.1.3 The next Example 5.1.9 of this section explains the origin of the proportions
given in the Empirical Rule.

Example 5.1.9 Find the probabilities indicated.

1.) P (−1 < Z < 1).

Solution : Using the table we obtain

P (−1 < Z < 1) = 0.8413 − 0.1587 = 0.6826

Since Z has mean 0 and standard deviation 1, for Z to take a value between
−1 and 1 means that Z takes a value that is within one standard deviation of
the mean. Our computation shows that the probability that this happens is about
0.68, the proportion given by the Empirical Rule for histograms that are mound
shaped and symmetrical, like the bell curve. ■

2.) P (−2 < Z < 2).

Solution : Using the negative table in the same way,

P (−2 < Z < 2) = 0.9772 − 0.0228 = 0.9544

This corresponds to the proportion 0.95 for data within two standard deviations
of the mean. ■

3.) P (−3 < Z < 3).

Solution : Similarly,

P (−3 < Z < 3) = 0.9987 − 0.0013 = 0.9974

which corresponds to the proportion 0.997 for data within three standard devia-
tions of the mean. ■
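As a numerical cross-check of these Empirical Rule proportions, here is a short scipy sketch (an added illustration, not part of the original text). The exact values differ slightly from the four-decimal table values quoted above because the tables are rounded.

```python
from scipy.stats import norm

# P(-k < Z < k) for k = 1, 2, 3 standard deviations.
for k in (1, 2, 3):
    p = norm.cdf(k) - norm.cdf(-k)
    print(k, round(p, 4))   # approx 0.6827, 0.9545, 0.9973
```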


5.2 The Uniform or Rectangular Distribution


Definition 5.2.1 A continuous random variable X is said to be uniformly distributed if its
probability density function is given by

f(x) = \begin{cases} \dfrac{1}{b-a} & ; \ a < x < b \\[4pt] 0 & ; \ \text{otherwise} \end{cases}    (5.2)

To derive the pdf of a uniform random variable, we consider a constant function f(x) = c on a ≤ x ≤ b.

Figure 5.7: Area of a Uniform distribution (the graph of f(x) = c over the interval [a, b])

Since the total area under the density curve must equal 1,

Area = (length) × (height)

1 = (b − a)\,c \ \Rightarrow\ c = \frac{1}{b-a}

thus

f(x) = \frac{1}{b-a}, \quad a \leq x \leq b


Example 5.2.1 The number of goals scored during the African Cup of Nations 2013 is uniformly
distributed with probability density function

f(x) = \frac{1}{6}, \quad 1 \leq x \leq 7
Compute the

1.) Probability that the number of goals scored in any game is less than or equal to 3.

P(X \leq 3) = \int_1^3 \frac{1}{6}\, dx = \frac{2}{6} = \frac{1}{3}

2.) Expected number of goals in any game at the African Cup of Nations 2013?

E[X] = \int_{-\infty}^{\infty} x f(x)\, dx = \int_1^7 \frac{x}{6}\, dx = \left[\frac{x^2}{12}\right]_1^7 = \frac{49}{12} - \frac{1}{12} = \frac{48}{12} = 4

3.) The probability that the final is nil-nil at the end of the game, P(X = 0), is zero, since the
density places no probability below x = 1.
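The same calculations can be reproduced with scipy.stats.uniform; the sketch below (an added illustration, not part of the original text) encodes the interval [1, 7] as loc=1, scale=6.

```python
from scipy.stats import uniform

# X ~ Uniform(1, 7): loc = 1, scale = 7 - 1 = 6
goals = uniform(loc=1, scale=6)
print(goals.cdf(3))    # P(X <= 3) = 2/6 = 0.3333...
print(goals.mean())    # E[X] = 4.0
print(goals.pdf(0))    # density is 0 below x = 1, so P(X = 0) = 0
```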

Exercise 5.1 For a uniform distribution, show that the CDF (cumulative distribution function) is given by

F(x) = \begin{cases} 0 & ; \ x < a \\[4pt] \dfrac{x-a}{b-a} & ; \ x \in [a, b) \\[4pt] 1 & ; \ x \geq b \end{cases}

Exercise 5.2 The continuous random variable X is uniformly distributed over the interval
[−2, 7].

1.) Write down fully the probability density function f (x) of X.

2.) Sketch the probability density function f (x) of X.

3.) Compute E[X 2 ],

4.) Determine P (−0.2 < X < 0.6).


5.2.1 The Expectation of a Continuous Uniform Distribution


Definition 5.2.2 The expectation or mean of a uniform distribution is given by

E(X) = \frac{b+a}{2}    (5.3)

Proof :

E(X) = \int_{-\infty}^{\infty} x f(x)\, dx = \int_a^b \frac{x}{b-a}\, dx = \frac{1}{b-a}\left[\frac{1}{2}x^2\right]_a^b = \frac{b^2 - a^2}{2(b-a)} = \frac{(b-a)(b+a)}{2(b-a)} = \frac{b+a}{2}

5.2.2 The Variance of the Continuous Uniform Distribution


Definition 5.2.3 The variance of a uniform distribution is given by

Var(X) = \frac{(b-a)^2}{12}    (5.4)

For the proof, we first compute E(X²) using the definition.

Proof :

E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\, dx = \int_a^b \frac{x^2}{b-a}\, dx = \frac{1}{b-a}\left[\frac{1}{3}x^3\right]_a^b = \frac{b^3 - a^3}{3(b-a)} = \frac{b^2 + ab + a^2}{3}

Thus

Var(X) = E(X^2) - [E(X)]^2
       = \frac{1}{3}(b^2 + ab + a^2) - \frac{1}{4}(b + a)^2
       = \frac{1}{12}\left(4b^2 + 4ab + 4a^2 - 3b^2 - 6ab - 3a^2\right)
       = \frac{1}{12}(b^2 - 2ab + a^2)
       = \frac{(b-a)^2}{12}


Example 5.2.2 A packing line consistently packages 200 cartons per hour. After weighing
every package, variation was found in the weights, which ranged from 18.2 kg to 20.4 kg measured
to the nearest tenth. The customer requires < 20.0 kg for ergonomic reasons. Find the

1.) mean

\mu = \frac{a+b}{2} = \frac{1}{2}(18.2 + 20.4) = 19.3 \text{ kg}

2.) standard deviation

\sigma = \frac{b-a}{\sqrt{12}} = \frac{1}{\sqrt{12}}(20.4 - 18.2) = 0.635 \text{ kg}

3.) probability that a package exceeds the customer requirement

P(X \geq 20.0) = \int_{20.0}^{20.4} \frac{1}{b-a}\, dx = \frac{20.4 - 20.0}{20.4 - 18.2} = 0.1818

There is an 18.18% chance that a package exceeds the conforming specification, otherwise called the Voice
of the Customer.
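For reference, a small Python check of these three values using scipy.stats.uniform (an illustrative sketch added here, not part of the original text):

```python
from scipy.stats import uniform

# Package weight X ~ Uniform(18.2, 20.4): loc = 18.2, scale = 20.4 - 18.2 = 2.2
weight = uniform(loc=18.2, scale=2.2)
print(round(weight.mean(), 3))     # mean: 19.3 kg
print(round(weight.std(), 3))      # standard deviation: ~0.635 kg
print(round(weight.sf(20.0), 4))   # P(X >= 20.0): ~0.1818
```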
Exercise 5.3 A coffee machine dispenses coffee into cups. It is controlled electronically to
randomly cut off the flow of coffee between 180 ml to 190 ml. Find the probability that the
machine dispenses
1.) less than 188 ml
2.) exactly 188 ml
3.) between 182 ml and 186 ml
Exercise 5.4 The time in minutes that Elaine takes to checkout at her local supermarket
follows a continuous uniform distribution over the interval [3, 9]. Find
1.) Elaine’s expected checkout time,
2.) the variance of the time taken to checkout at the supermarket,
3.) the probability that Elaine will take more than 7 minutes to checkout.
4.) Given that Elaine has already spent 4 minutes at the checkout, find the probability that
she will take a total of less than 6 minutes to checkout.
Exercise 5.5 In a game, players select sticks at random from a box containing a large number
of sticks of different lengths. The length , in cm, of a randomly chosen stick has a continuous
uniform distribution over the interval [7, 10].
A stick is selected at random from the box.
1.) Find the probability that the stick is shorter than 9.5 cm.
To win a bag of sweets, a player must select 3 sticks and wins if the length of the longest
stick is more than 9.5 cm.
2.) Find the probability of winning a bag of sweets.
To win a soft toy, a player must select 6 sticks and wins the toy if more than four of the
sticks are shorter than 7.6 cm.
3.) Find the probability of winning a soft toy.


Example 5.2.3 A random variable X has the uniform distribution on the interval [0, 1]: the
density function is f (x) = 1 if x is between 0 and 1 and f (x) = 0 for all other values of x, as
shown in Figure 5.8.

Figure 5.8: Uniform Distribution on [0,1]

(a) Find P (X > 0.75), the probability that X assumes a value greater than 0.75.

Solution : P (X > 0.75) is the area of the rectangle of height 1 and base length
1 − 0.75 = 0.25, hence is

base × height = (0.25) · (1) = 0.25

See Figure 5.9 part (a).

Figure 5.9: Probabilities from the Uniform Distribution on [0,1]

(b) Find P (X ≤ 0.2), the probability that X assumes a value less than or equal to 0.2.

Solution : P (X ≤ 0.2) is the area of the rectangle of height 1 and base length
0.2 − 0 = 0.2, hence is base × height = (0.2) · (1) = 0.2. See Figure 5.9 part (b).

(c) Find P (0.4 < X < 0.7), the probability that X assumes a value between 0.4 and 0.7.

Solution : P (0.4 < X < 0.7) is the area of the rectangle of height 1 and length
0.7 − 0.4 = 0.3, hence is base × height = (0.3) · (1) = 0.3. See Figure 5.9 part
(c). ■


Example 5.2.4 A man arrives at a bus stop at a random time (that is, with no regard for the
scheduled service) to catch the next bus. Buses run every 30 minutes without fail, hence the
next bus will come any time during the next 30 minutes with evenly distributed probability (a
uniform distribution). Find the probability that a bus will come within the next 10 minutes.

Solution : The graph of the density function is a horizontal line above the interval
from 0 to 30 and coincides with the x-axis everywhere else. Since the total area under the curve
must be 1, the height of the horizontal line is 1/30 (Figure 5.10). The probability
sought is P (0 ≤ X ≤ 10). By definition, this probability is the area of the rectangular
region bounded above by the horizontal line f(x) = 1/30, bounded below by the x-axis,
bounded on the left by the vertical line at 0 (the y-axis), and bounded on the
right by the vertical line at 10. This is the shaded region in Figure 5.10. Its area is
the base of the rectangle times its height,

(10) · (1/30) = 1/3

Thus P (0 ≤ X ≤ 10) = 1/3.

Figure 5.10: Probability of Waiting At Most 10 Minutes for a Bus


5.3 The Exponential Distribution


Definition 5.3.1 A continuous random variable whose probability density function is given by

f(x) = \begin{cases} \lambda e^{-\lambda x} & ; \ x \geq 0 \\[4pt] 0 & ; \ x < 0 \end{cases}    (5.5)

for some λ > 0, is said to be an exponential random variable, or is said to be exponentially
distributed with parameter λ. It usually models “waiting time”.

Examples of exponentially distributed random variables are

1.) The life time of an electronic device

2.) The time between arrivals of two successive buses

3.) The duration time of a car service

4.) Time until next earthquake occurs

5.) Waiting time for the next person to be served at the bank counter

This is a valid probability density function since

\int_0^{\infty} f(x)\, dx = \int_0^{\infty} \lambda e^{-\lambda x}\, dx = \left[-e^{-\lambda x}\right]_0^{\infty} = 1

We note that

1.) P(a < X < b) = \int_a^b f(x)\, dx

2.) P(X < a) = \int_0^a \lambda e^{-\lambda x}\, dx = \left[-e^{-\lambda x}\right]_0^a = 1 - e^{-\lambda a}

3.) P(X > a) = 1 - P(X \leq a) = 1 - P(X < a) = 1 - (1 - e^{-\lambda a}) = e^{-\lambda a}

4.)

P[X > a + b \mid X > a] = \frac{P[(X > a) \cap (X > a + b)]}{P(X > a)} = \frac{P(X > a + b)}{P(X > a)} = \frac{e^{-\lambda(a+b)}}{e^{-\lambda a}} = e^{-\lambda b}

so that

P[X > a + b \mid X > a] = P(X > b)    (5.6)

Thus

P[X > 75 \mid X > 30] = P(X > 45)

P[X > 1200 \mid X > 850] = P(X > 350)

Also

P[X > q \mid X > p] = P(X > q - p), \quad q > p    (5.7)
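The memoryless property in Equations (5.6) and (5.7) can be illustrated numerically; the following sketch (added here, not part of the original text, using an arbitrary rate λ = 0.2) compares both sides of the identity.

```python
import numpy as np
from scipy.stats import expon

lam = 0.2                      # arbitrary rate chosen for the illustration
X = expon(scale=1 / lam)       # scipy parameterizes the exponential by scale = 1/lambda

a, b = 30.0, 45.0
lhs = X.sf(a + b) / X.sf(a)    # P(X > a + b | X > a)
rhs = X.sf(b)                  # P(X > b)
print(np.isclose(lhs, rhs))    # True: the exponential "forgets" the first a units of waiting
```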


Example 5.3.1 Suppose that the length of a phone call in minutes is an exponential random
variable with parameter λ = \frac{1}{10}. If someone arrives immediately ahead of you at a public telephone
booth, find the probability that you will have to wait

1.) more than 10 minutes

f(x) = \begin{cases} \frac{1}{10}\, e^{-x/10} & ; \ x \geq 0 \\[4pt] 0 & ; \ x < 0 \end{cases}

P(X > 10) = \int_{10}^{\infty} \frac{1}{10}\, e^{-x/10}\, dx = \left[-e^{-x/10}\right]_{10}^{\infty} = e^{-1} = 0.3679

2.) between 10 and 20 minutes

P(10 < X < 20) = \int_{10}^{20} \frac{1}{10}\, e^{-x/10}\, dx = \left[-e^{-x/10}\right]_{10}^{20} = e^{-1} - e^{-2} = 0.233
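Both probabilities can be checked with scipy.stats.expon, which uses the scale parameter 1/λ = 10 (an illustrative sketch added here, not part of the original text):

```python
from scipy.stats import expon

# Call length X ~ Exponential(lambda = 1/10), i.e. scale = 10 minutes
call = expon(scale=10)
print(round(call.sf(10), 4))                  # P(X > 10)      -> 0.3679
print(round(call.cdf(20) - call.cdf(10), 4))  # P(10 < X < 20) -> 0.2325 (≈ 0.233)
```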

Example 5.3.2 The waiting time for the BSc External students to their next class is exponentially
distributed with λ = 5. Find the probability that any student of BSc External will wait
for less than 4 hours.

f(x) = \begin{cases} 5 e^{-5x} & ; \ x \geq 0 \\[4pt] 0 & ; \ x < 0 \end{cases}

\Rightarrow P(X < 4) = \int_0^4 5 e^{-5x}\, dx = \left[-e^{-5x}\right]_0^4 = 1 - e^{-20} \approx 0.999999998


5.3.1 The Cumulative Distribution Function of the Exponential Distribution

We define the cumulative distribution function, F(x), of an exponentially distributed random
variable X by

F(x) = 1 - e^{-\lambda x}, \quad x \geq 0    (5.8)

By definition,

F(x) = \int_{-\infty}^{x} f(t)\, dt = \int_0^{x} \lambda e^{-\lambda t}\, dt = \left[-e^{-\lambda t}\right]_0^{x} = 1 - e^{-\lambda x}, \quad x \geq 0

5.3.2 The Expectation of an Exponential Distribution


Definition 5.3.2 If a random variable X has an exponential distribution, then the mean of X is

E(X) = \frac{1}{\lambda}    (5.9)

Proof : Applying the definition of expectation and integrating by parts,

E(X) = \int_{-\infty}^{\infty} x f(x)\, dx = \int_0^{\infty} \lambda x e^{-\lambda x}\, dx = \left[-x e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\, dx = \left[-x e^{-\lambda x} - \frac{1}{\lambda} e^{-\lambda x}\right]_0^{\infty} = \frac{1}{\lambda}

5.3.3 The Variance of an Exponential Random Variable


Definition 5.3.3 If a random variable X has an exponential distribution, then the variance
of X is

Var(X) = \frac{1}{\lambda^2}    (5.10)

Proof :

E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\, dx = \int_0^{\infty} \lambda x^2 e^{-\lambda x}\, dx = \lambda \left[-\frac{x^2}{\lambda} e^{-\lambda x} - \frac{2x}{\lambda^2} e^{-\lambda x} - \frac{2}{\lambda^3} e^{-\lambda x}\right]_0^{\infty} = \frac{2}{\lambda^2}

Thus

Var(X) = E(X^2) - [E(X)]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}


Example 5.3.3 The lifetime of a particular type of bulb has an exponential distribution with
mean lifetime of 1000 hours.

1.) Find the probability that a bulb is still working after 1300 hours.

Let X = the lifetime of a light bulb in hours. Then

f (x) = λe−λx , x ≥ 0.

Now

E(X) = \frac{1}{\lambda}

and given E(X) = 1000,

\frac{1}{\lambda} = 1000 \ \Rightarrow\ \lambda = 0.001.

And so,

f(x) = 0.001 e^{-0.001x}

P(X > 1300) = e^{-1.3} = 0.2725

2.) Given that it is still working after 1300 hours, find the probability that it is still working
after 1500 hours.

P[(X > 1500) \mid (X > 1300)] = P(X > 200) \quad \text{(by the exponential identity (5.7))}
= e^{-0.001 \times 200} = e^{-0.2} = 0.819

3.) Find the standard deviation of the lifetime of this type of light bulb.
\text{Standard Deviation} = \sqrt{\frac{1}{\lambda^2}} = \frac{1}{\lambda} = 1000 \text{ hours}
Example 5.3.4 Suppose that the amount of time one spends in a bank is exponentially dis-
tributed with mean 10 minutes.

1.) What is the probability that a customer will spend more than 15 minutes in the bank?

P (X > 15) = e−15λ = e−3/2 = 0.22

2.) What is the probability that a customer will spend more than 15 minutes in the bank given
that he is still in the bank after 10 minutes?

P (X > 15 | X > 10) = P (X > 5) = e−1/2 = 0.6065

By the special Exponential identity Equation 5.6 or Equation 5.7


Note 5.3.1 Did you realize that for Example 5.3.4, λ = \frac{1}{10}? But why?


Example 5.3.5 The time required to repair a machine is an exponential random variable
with rate λ = 0.5 downs/hour.

1.) What is the probability that a repair time exceeds 2 hours?

P (T ≥ 2) = e−1 = 0.36788

2.) What is the probability that the repair time will take at least 4 hours given that the repair
man has been working on the machine for 3 hours?

P (T ≥ 4 | T ≥ 3) = P (T ≥ 1) = e−0.5 = 0.60653

By the special Exponential identity Equation 5.6 or Equation 5.7

Example 5.3.6 Buses arrive to a bus stop according to an exponential distribution with rate
λ = 4 buses/hour.

1.) If you arrived at 8:00 am at the bus stop, what is the expected time of the next bus?

Next = 8:00 + Expected waiting time = 8:00 + E[T] = 8:00 + \frac{1}{\lambda} \text{ hour} = 8:00 + \frac{1}{4} \text{ hour} = 8:00 + 15 \text{ min} = 8:15 \text{ am}

2.) Assume you asked one of the people waiting for the bus about the arrival time of the last
bus and he told you that the last bus left at 7:40 am. What is the expected time of the
next bus?

By the memoryless property of the exponential distribution, the arrival time of the last bus is irrelevant:

Next = 8:00 + \frac{1}{4} \text{ hour} = 8:00 + 15 \text{ min} = 8:15 \text{ am}

Exercise 5.6 On the average, a certain computer part lasts 10 years. The length of time the
computer part lasts is exponentially distributed.

1.) What is the probability that a computer part lasts more than 7 years? 0.4966

2.) On the average, how long would 5 computer parts last if they are used one after another?
50 years

3.) Eighty percent of computer parts last at most how long? 16.1 years

4.) What is the probability that a computer part lasts between 9 and 11 years? 0.0737


5.4 Gamma distribution


Definition 5.4.1 We define the Gamma function as

\Gamma(\alpha) = \int_0^{\infty} x^{\alpha-1} e^{-x}\, dx < \infty, \quad \alpha > 0

Properties of Γ(α):

\Gamma(\alpha) = (\alpha - 1)\Gamma(\alpha - 1), \quad \alpha > 1

\Gamma(\alpha) = (\alpha - 1)! \quad \text{for integer } \alpha \geq 1

\Gamma\left(\tfrac{1}{2}\right) = \sqrt{\pi}, \quad \text{thus}

\Gamma\left(\alpha + \tfrac{1}{2}\right) = \left(\alpha - \tfrac{1}{2}\right)\left(\alpha - \tfrac{3}{2}\right)\cdots\tfrac{1}{2}\,\Gamma\left(\tfrac{1}{2}\right) = \frac{1 \cdot 3 \cdot 5 \cdots (2\alpha - 1)}{2^{\alpha}}\,\Gamma\left(\tfrac{1}{2}\right)

Consider the function

G(\alpha, \beta) = \int_0^{\infty} x^{\alpha-1} e^{-\beta x}\, dx, \quad \alpha > 0, \ \beta > 0

Let u = \beta x \Rightarrow du = \beta\, dx, so

G(\alpha, \beta) = \int_0^{\infty} \left(\frac{u}{\beta}\right)^{\alpha-1} e^{-u}\, \frac{du}{\beta} = \frac{1}{\beta^{\alpha}} \int_0^{\infty} u^{\alpha-1} e^{-u}\, du = \frac{\Gamma(\alpha)}{\beta^{\alpha}}

Thus

\int_0^{\infty} \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}\, dx = 1

Definition 5.4.2 It follows that the function

f(x; \alpha, \beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}, \quad x > 0, \ \alpha > 0, \ \beta > 0

is a probability density function. A random variable with f(x; α, β) as its probability density
function is said to have a Gamma distribution with parameters α and β.

Remark 5.4.1 In other words,

\int_0^{\infty} x^{\alpha-1} e^{-\beta x}\, dx = \frac{\Gamma(\alpha)}{\beta^{\alpha}}

A consequence of the above is that we can easily find values of gamma integrals.

Exercise 5.7

\int_0^{\infty} x^{3} e^{-\frac{1}{2}x}\, dx = \frac{\Gamma(4)}{\left(\frac{1}{2}\right)^{4}} = 16 \cdot 3! = 96

Exercise 5.8

\int_0^{\infty} x^{4} e^{-\theta x}\, dx = \frac{\Gamma(5)}{\theta^{5}} = \frac{24}{\theta^{5}}
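The value in Exercise 5.7 can be confirmed by numerical integration; the sketch below uses scipy (an added illustration, not part of the original text).

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

# Integral of x^3 * exp(-x/2) over (0, inf) should equal Gamma(4) / (1/2)^4 = 96
numeric, _ = quad(lambda x: x**3 * np.exp(-x / 2), 0, np.inf)
closed_form = gamma(4) / (0.5 ** 4)
print(numeric, closed_form)   # both approximately 96.0
```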


5.4.1 Expectation and Variance of a Gamma Distribution


Let X ∼ Γ(α, β), i.e. the probability density function of X is

f(x; \alpha, \beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}, \quad x > 0, \ \alpha > 0

Therefore, the expectation of a Gamma distribution is

E(X) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha} e^{-\beta x}\, dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha+1)}{\beta^{\alpha+1}} = \frac{\alpha}{\beta}    (5.11)

Using

E(X^2) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha+1} e^{-\beta x}\, dx = \frac{\alpha(\alpha+1)}{\beta^2}

the variance of a Gamma distribution is given by

V(X) = E(X^2) - [E(X)]^2 = \frac{\alpha}{\beta^2}    (5.12)

Moreover,

E(X^k) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha+k-1} e^{-\beta x}\, dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha+k)}{\beta^{\alpha+k}}, \quad k \in \mathbb{Z}^{+}

But

\Gamma(\alpha+k) = (\alpha+k-1)(\alpha+k-2)\cdots\alpha\,\Gamma(\alpha)

Therefore

E(X^k) = \frac{(\alpha+k-1)(\alpha+k-2)\cdots\alpha}{\beta^{k}}    (5.13)

Exercise 5.9 Use the above result to find the third central moment of X.
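Separately, the mean and variance formulas (5.11) and (5.12) can be checked numerically with scipy.stats.gamma, which takes the shape a = α and scale = 1/β (an illustrative sketch added here, not part of the original text; the values α = 3, β = 2 are arbitrary).

```python
from scipy.stats import gamma

alpha, beta = 3.0, 2.0               # arbitrary shape and rate parameters
X = gamma(a=alpha, scale=1 / beta)   # scipy uses scale = 1/beta for a rate parameterization
print(X.mean(), alpha / beta)        # 1.5  1.5
print(X.var(), alpha / beta**2)      # 0.75 0.75
```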

5.4.2 Gamma and the Exponential Distribution


If α = 1, then Gamma(1, β) ≡ Exponential(β), that is
f(x, \beta) = \beta e^{-\beta x}, \quad \beta > 0, \ x > 0    (5.14)

5.4.3 Gamma and the Chi-Squared Distribution


If α = \frac{n}{2} and β = \frac{1}{2}, the Gamma density takes the form

f(x, n) = \frac{\left(\frac{1}{2}\right)^{n/2}}{\Gamma\left(\frac{n}{2}\right)}\, x^{\frac{n}{2}-1} e^{-\frac{x}{2}}    (5.15)

Definition 5.4.3 A random variable with probability density function f (x, n) in Equation
(5.15) is said to have a Chi-Squared χ2 distribution with n degrees of freedom. We write
X ∼ χ2(n) . Verify that
E(X) = n
Var(X) = 2n
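The claim E(X) = n and Var(X) = 2n can be checked directly with scipy.stats.chi2 (an illustrative sketch added here, not part of the original text):

```python
from scipy.stats import chi2

# Mean and variance of the chi-squared distribution for a few degrees of freedom.
for n in (1, 4, 10):
    X = chi2(df=n)
    print(n, X.mean(), X.var())   # mean = n, variance = 2n
```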


5.5 Beta Distribution


A continuous random variable X is said to have a beta distribution with parameters α and β
(α > 0 and β > 0) if the probability density function of X is written as

f(x; \alpha, \beta) = \begin{cases} \dfrac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha-1}(1-x)^{\beta-1} & ; \ 0 < x < 1 \\[6pt] 0 & ; \ \text{otherwise} \end{cases}

To verify that f(x; α, β) is a probability density function, we show that

\int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\, dx = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}    (5.16)

5.5.1 Expectation and Variance of a Beta Distribution


The moments of the beta distribution are easy to calculate. Using the definition of the k-th
moment, we have

E(X^k) = \int_0^1 x^k f(x; \alpha, \beta)\, dx = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)} \int_0^1 x^{k+\alpha-1}(1-x)^{\beta-1}\, dx    (5.17)

Comparing (5.17) with (5.16), we get

E(X^k) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)} \cdot \frac{\Gamma(\alpha+k)\Gamma(\beta)}{\Gamma(\alpha+\beta+k)} = \frac{\Gamma(\alpha+\beta)\,\Gamma(\alpha+k)}{\Gamma(\alpha)\,\Gamma(\alpha+\beta+k)} = \frac{\alpha(\alpha+1)\cdots(\alpha+k-1)}{(\alpha+\beta)(\alpha+\beta+1)\cdots(\alpha+\beta+k-1)}

It then follows that

E(X) = \frac{\alpha}{\alpha+\beta}

V(X) = E(X^2) - [E(X)]^2 = \frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)} - \left(\frac{\alpha}{\alpha+\beta}\right)^2 = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}
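These formulas agree with scipy.stats.beta; a short numerical sketch follows (added here, not part of the original text, with arbitrary parameters α = 2, β = 5).

```python
from scipy.stats import beta as beta_dist

a, b = 2.0, 5.0                                       # arbitrary parameters for the check
X = beta_dist(a, b)
print(X.mean(), a / (a + b))                          # both 0.2857...
print(X.var(), a * b / ((a + b)**2 * (a + b + 1)))    # both 0.02551...
```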


5.6 Other Important Distributions


5.6.1 The Student’s T-Distribution
The student’s t-distribution is very closely related to the normal distribution. They are both
symmetric about the mean zero and bell-shaped.

Sometimes, we find that the variance of the population from which we are sampling is not
known. For samples of size n ≥ 30, a good estimate for σ 2 is S 2 , the sample variance. Then,
by estimating σ 2 with S 2 , the Z-value in the Central Limit Theorem is still approximately
normally distributed and
Z = \frac{\bar{X} - \mu}{S/\sqrt{n}}

If n < 30, the values of S² fluctuate considerably from sample to sample, and as such, by
estimating σ² with S², the values of Z are no longer normally distributed. We then refer to this
distribution as the t-distribution.

Definition 5.6.1 If X̄ and S² are the mean and variance, respectively, of a random sample of
size n taken from a population that is normally distributed with mean µ and variance σ², then

t = \frac{\bar{X} - \mu}{S/\sqrt{n}}    (5.18)

is a value of a random variable T having the t-distribution with v = n − 1 degrees of freedom.
The pdf of the t-distribution is defined by

f(x) = c_{\nu}\left(1 + \frac{x^2}{\nu}\right)^{-\frac{1}{2}(\nu+1)}, \quad -\infty < x < \infty    (5.19)

where ν is the number of degrees of freedom and c_ν depends on ν.

5.6.2 The Chi-Square Distribution


Definition 5.6.2 The probability density function for a chi-square random variable can be deduced from that
of a gamma distribution (with α = n/2 and β = 1/2) and is written as

f(x, n) = \frac{1}{2^{n/2}\,\Gamma\left(\frac{n}{2}\right)}\, x^{\frac{n}{2}-1} e^{-\frac{x}{2}}; \quad x > 0.    (5.20)

5.6.3 The F-Distribution


The Fisher’s (F)-distribution is the ratio of two independent χ2 -distribution each divided by
its degrees of freedom. Then the degrees of freedom of the two χ2 distributions become the
degrees of freedom of the F -distribution.

It is important to note that: the total area under an F-curve is 1; an F-curve starts from
0 and then behaves like a χ²-curve; an F-curve is not symmetric but is skewed to the right.
From the above, we note that the F-distribution has an f-value defined by

f = \frac{\chi_1^2/\nu_1}{\chi_2^2/\nu_2} = \frac{S_1^2/\sigma_1^2}{S_2^2/\sigma_2^2} = \frac{\sigma_2^2 S_1^2}{\sigma_1^2 S_2^2}    (5.21)

where χ₁² is a value of a chi-square distribution with ν₁ = n₁ − 1 degrees of freedom and χ₂² is
a value of a chi-square distribution with ν₂ = n₂ − 1 degrees of freedom, and we write f(ν₁, ν₂).


5.7 Probability Distribution Functions Chapter Examples
5.7.1 Discrete Distributions Overview
Distribution | Notation | f(x) | E[X] | Var[X]
Bernoulli | b(p) | p^x (1-p)^{1-x} | p | p(1-p)
Binomial | b(n, p) | \binom{n}{x} p^x (1-p)^{n-x} | np | np(1-p)
Hypergeometric | Hyp(N, m, n) | \binom{m}{x}\binom{N-m}{n-x} \big/ \binom{N}{n} | \frac{nm}{N} | \frac{nm(N-n)(N-m)}{N^2(N-1)}
Negative Binomial | Nb(n, r, p) | \binom{n-1}{r-1} p^r (1-p)^{n-r} | \frac{r}{p} | \frac{r(1-p)}{p^2}
Geometric | Geo(p) | p(1-p)^{x-1}, \ x = 1, 2, \ldots | \frac{1}{p} | \frac{1-p}{p^2}
Poisson | Po(λ) | \frac{e^{-\lambda}\lambda^x}{x!} | λ | λ

For different values of the parameters, the curves of the distributions in Figure 5.11 indicate that
some distributions can be approximately symmetric at specific parameter values, but some, say the
Geometric, fail to become bell shaped.

Figure 5.11: Binomial, Geometric and Poisson curves


5.7.2 Continuous Distributions Overview

Distribution | Notation | f(x) | E[X] | Var[X]
Uniform | Unif(a, b) | \frac{1}{b-a} | \frac{a+b}{2} | \frac{(b-a)^2}{12}
Normal | N(µ, σ²) | \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) | µ | σ²
Student's t | Student(ν) | \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \left(1 + \frac{x^2}{\nu}\right)^{-(\nu+1)/2} | 0 | \frac{\nu}{\nu-2} for ν > 2; ∞ for 1 < ν ≤ 2
Chi-square | χ²_k | \frac{1}{2^{k/2}\Gamma\left(\frac{k}{2}\right)}\, x^{\frac{k}{2}-1} e^{-\frac{x}{2}} | k | 2k
F | F(d₁, d₂) | \frac{1}{x\,B\left(\frac{d_1}{2}, \frac{d_2}{2}\right)} \sqrt{\frac{(d_1 x)^{d_1}\, d_2^{d_2}}{(d_1 x + d_2)^{d_1+d_2}}} | \frac{d_2}{d_2-2} | \frac{2 d_2^2 (d_1 + d_2 - 2)}{d_1 (d_2 - 2)^2 (d_2 - 4)}
Exponential | Exp(λ) | λ e^{-λx} | \frac{1}{\lambda} | \frac{1}{\lambda^2}
Gamma (scale form) | Gamma(α, β) | \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\, x^{\alpha-1} e^{-x/\beta} | αβ | αβ²
Gamma (rate form) | Gamma(α, λ) | \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x} | \frac{\alpha}{\lambda} | \frac{\alpha}{\lambda^2}
Beta | Beta(α, β) | \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}\, x^{\alpha-1}(1-x)^{\beta-1} | \frac{\alpha}{\alpha+\beta} | \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}
Weibull | Weibull(λ, k) | \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1} \exp\left(-\left(\frac{x}{\lambda}\right)^{k}\right) | λΓ\left(1 + \frac{1}{k}\right) | λ²Γ\left(1 + \frac{2}{k}\right) − µ²
Pareto | Pareto(x_m, α) | \frac{\alpha x_m^{\alpha}}{x^{\alpha+1}}, \ x \geq x_m | \frac{\alpha x_m}{\alpha-1} for α > 1 | \frac{x_m^2 \alpha}{(\alpha-1)^2(\alpha-2)} for α > 2


For the Matlab codes used to generate the distribution graphs above, check the file “mat
distributions.m” in the appendix or the prob zip folder.

Figure 5.12: Chi-square, Normal, Student’s t and Beta continuous distributions

Exercise 5.10 Each time customers visit a restaurant they are given a game card. Suppose
the probability of winning a prize with the game card is 0.2. Let X represent the number of
visits to a restaurant before winning a prize with the game card. What is the probability that
a customer will win a prize for the first time on the 6th visit? [0.06554]

Exercise 5.11 An oil company conducts a geological study that indicates that an exploratory
oil well should have a 0.20 chance of striking oil. What is the probability that the third strike
comes on the seventh well drilled? [0.0492]

Figure 5.13: Exponential and Gamma continuous distributions

Example 5.7.1 For a Uniform Distribution in [0, 1].

F(x) = \begin{cases} 0 & ; \text{ if } x < 0 \\ x & ; \text{ if } 0 \leq x \leq 1 \\ 1 & ; \text{ if } x > 1 \end{cases}, \qquad \rho(x) = \begin{cases} 0 & ; \text{ if } x < 0 \\ 1 & ; \text{ if } 0 \leq x \leq 1 \\ 0 & ; \text{ if } x > 1 \end{cases}

E(X) = \int_{-\infty}^{\infty} x\,\rho(x)\, dx = \int_0^1 x\, dx = \frac{1}{2},

Var(X) = \int_0^1 x^2\, dx - \left(\frac{1}{2}\right)^2 = \frac{1}{3} - \frac{1}{4} = \frac{1}{12}.

Note 5.7.1 Here ρ(x) denotes the probability density function, i.e. ρ(x) = f(x).

Example 5.7.2 For a Normal (Gaussian) Distribution, Mean µ, Variance σ 2 .

\rho(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),

F(x) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{x} \exp\left(-\frac{(t-\mu)^2}{2\sigma^2}\right) dt

Example 5.7.3 The cumulative distribution function (cdf) of the Gamma distribution with
parameters α > 0, λ > 0 is

F(x; \alpha, \lambda) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{x} t^{\alpha-1} e^{-\lambda t}\, dt.


Exercise 5.12 Suppose you are watching a radioactive source that emits particles at a rate
described by the exponential density

f (t) = λe−λt , t ≥ 0

where λ = 1, so that the probability P(0, T) that a particle will appear in the next T seconds
is P([0, T]) = \int_0^T \lambda e^{-\lambda t}\, dt. Find the probability that a particle (not necessarily the first) will
appear
1.) within the next second.

2.) within the next 3 seconds.

3.) between 3 and 4 seconds from now.

4.) after 4 seconds from now.

Exercise 5.13 Assume that a new light bulb will burn out after t hours, where t is chosen
from [0, ∞) with an exponential density

f (t) = λe−λt .

In this context, λ is often called the failure rate of the bulb.


1.) Assume that λ = 0.01, and find the probability that the bulb will not burn out before T
hours. This probability is often called the reliability of the bulb.

2.) For what T is the reliability of the bulb = 1/2?

Exercise 5.14 Use the Binomial Probabilities to find the probability that, in 100 tosses of a
fair coin, the number of heads that turns up lies between 35 and 65, between 40 and 60, and
between 45 and 55.[Hint: For large n, n ≥ 30, apply Normal approximation to Binomial ]

Exercise 5.15 Charles claims that he can distinguish between beer and ale 75 percent of the
time. Ruth bets that he cannot and, in fact, just guesses. To settle this, a bet is made: Charles
is to be given ten small glasses, each having been filled with beer or ale, chosen by tossing a fair
coin. He wins the bet if he gets seven or more correct. Find the probability that Charles wins
if he has the ability that he claims. Find the probability that Ruth wins if Charles is guessing.

Exercise 5.16 A die is rolled 30 times. What is the probability that a 6 turns up exactly 5
times? What is the most probable number of times that a 6 will turn up?

Exercise 5.17 Find integers n and r such that the following equation is true:

\binom{13}{5} + 2\binom{13}{6} + \binom{13}{7} = \binom{n}{r}.
Exercise 5.18 In a ten-question true-false exam, find the probability that a student gets a
grade of 70 percent or better by guessing. Answer the same question if the test has 30 questions,
and if the test has 50 questions.
Exercise 5.19 A restaurant offers apple and blueberry pies and stocks an equal number of
each kind of pie. Each day ten customers request pie. They choose, with equal probabilities,
one of the two kinds of pie. How many pieces of each kind of pie should the owner provide so
that the probability is about 0.95 that each customer gets the pie of his or her own choice?


Exercise 5.20 A poker hand is a set of 5 cards randomly chosen from a deck of 52 cards. Find
the probability of a
1.) royal flush (ten, jack, queen, king, ace in a single suit).
2.) straight flush (five in a sequence in a single suit, but not a royal flush).
3.) four of a kind (four cards of the same face value).
4.) full house (one pair and one triple, each of the same face value).
5.) flush (five cards in a single suit but not a straight or royal flush).
6.) straight (five cards in a sequence, not all the same suit). (Note that in straights, an ace
counts high or low.)

Example 5.7.4 A fair coin is tossed until the second time a head turns up. The distribution
for the number of tosses is u(x, 2, p), where

u(x, k, p) = \binom{x-1}{k-1} p^{k} q^{x-k} \ \Rightarrow\ u\left(x, 2, \tfrac{1}{2}\right) = \binom{x-1}{1} \frac{1}{2^{x}}

for x = 2, 3, \ldots. A Negative Binomial problem.
Example 5.7.5 Two players A and B flip a biased coin alternately and the first player to
obtain a head wins. The probability of obtaining a head is p > 0 at each toss. If A flips first,
find the probability that A wins the game.

The event that A wins the game can be decomposed into

A \text{ wins} = \bigcup_{i=1}^{\infty} \{\,A \text{ wins on the } i\text{-th toss}\,\}.

Then we have

P(A \text{ wins}) = p + (1-p)^2 p + (1-p)^4 p + \cdots = \frac{p}{1 - (1-p)^2}

Example 5.7.6 A fair coin is tossed independently n times (n > 3). Find the probability that at
least three of the tosses yield heads.

P(\text{at least 3 H's}) = 1 - P(\text{at most 2 H's}) = 1 - \frac{1}{2^n} - \binom{n}{1}\frac{1}{2^n} - \binom{n}{2}\frac{1}{2^n}

Example 5.7.7 Find a value c that minimizes E [(X − c)2 ] for a discrete random variable X.

Let

f(c) = E\left[(X - c)^2\right] = c^2 - (2E[X])\,c + E[X^2]

Then f(c) is minimized at the point where f'(c) = 0; that is, f'(c) = 2c − 2E[X] = 0. So

c = E[X]

will give the minimum.


Example 5.7.8 Suppose the probability of containing at least one typographical error in a
page of a book is 0.005. Assuming typographical errors occur independently from page to page,
what is the probability that a 400 page novel will contain exactly one page with errors?

This is a Poisson approximation with λ = np = 400 × 0.005 = 2, so

P(X = 1) \approx \frac{e^{-2} \cdot 2^{1}}{1!} = 0.2707
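As a check, the same value follows from the Poisson pmf in scipy.stats; the exact binomial answer is shown for comparison (an illustrative sketch added here, not part of the original text).

```python
from scipy.stats import poisson, binom

lam = 400 * 0.005                            # lambda = n*p = 2
print(round(poisson.pmf(1, lam), 4))         # Poisson approximation: 0.2707
print(round(binom.pmf(1, 400, 0.005), 4))    # exact binomial: ~0.2707
```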

Example 5.7.9 The length of time required to complete a college test is found to be normally
distributed with mean 50 minutes and standard deviation 12 minutes.
1.) When should the test be terminated if we wish to allow sufficient time for 90% of the
students to complete the test?
Let X be the length of time to complete the test. Then Z = \frac{X - 50}{12} \sim N(0, 1), the
standard normal distribution. We need to find c such that

P(X < c) = P\left(Z < \frac{c-50}{12}\right) = 0.9 \ \Leftrightarrow\ \frac{c-50}{12} = 1.28

So at least c = 65.36 minutes should be given.

2.) What proportion of students will finish the test between 30 and 60 minutes?

P (30 < X < 60) = P (−1.67 < Z < 0.83) = 0.75
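Both parts can also be computed without a table using scipy.stats.norm; the sketch below (added here, not part of the original text) uses the exact normal quantile, so the cutoff differs slightly from the table-based 65.36.

```python
from scipy.stats import norm

# Completion time X ~ N(mu = 50, sigma = 12)
X = norm(loc=50, scale=12)
print(X.ppf(0.90))            # 90th percentile: ~65.38 minutes
print(X.cdf(60) - X.cdf(30))  # P(30 < X < 60): ~0.75
```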

Example 5.7.10 Let X have the probability density function


f(x) = \begin{cases} c e^{-x} & ; \ 1 < x < \infty, \\ 0 & ; \ \text{otherwise.} \end{cases}

1.) Find the value c.

\int_1^{\infty} c e^{-x}\, dx = \left[-c e^{-x}\right]_1^{\infty} = c e^{-1} = 1 \ \Leftrightarrow\ c = e

2.) Find the cumulative distribution function of X and plot its graph.

Integrating f, we get

F(x) = \begin{cases} 0 & ; \ x \leq 1, \\ 1 - e^{1-x} & ; \ x > 1. \end{cases}

F is a non-decreasing function and F(∞) = 1.

3.) Find P(−1 < X ≤ 3 | X ≥ 3).

P(-1 < X \leq 3 \mid X \geq 3) = \frac{P(X = 3)}{P(X \geq 3)} = 0

since the probability of a continuous random variable taking a single point is zero.

Exercise 5.21 Bob is a high school basketball player with probability of making a free throw
as 0.70. During the season, what is the probability that Bob makes his third free throw on his
fifth shot?


Example 5.7.11 The speed of a molecule in a uniform gas at equilibrium is a random variable
whose probability density function is given by

f(x) = \begin{cases} a x^2 e^{-b x^2}, & x \geq 0 \\ 0, & x < 0 \end{cases}

where b = \frac{m}{2\kappa T} and κ, T, and m denote, respectively, Boltzmann’s constant, the absolute
temperature of the gas, and the mass of the molecule. Evaluate a in terms of b.

a = \frac{4 b^{3/2}}{\sqrt{\pi}}, \quad \text{using integration by parts and the fact that } \frac{1}{\sqrt{2\pi}} \int_0^{\infty} e^{-y^2/2}\, dy = \frac{1}{2}

Example 5.7.12 Suppose X is a random variable following a normal distribution with vari-
ance σ 2 , where σ is the standard deviation of X. Show that the standard deviation of −3X + 2
is 3σ.

SD(-3X + 2) = \sqrt{Var(-3X + 2)} = \sqrt{9\sigma^2 + 0} = 3\sigma

Exercise 5.22 The standard deviation of X, denoted SD(X), is given by

SD(X) = \sqrt{Var(X)}

Find SD(aX + b) if X has variance σ². \left[\sqrt{a^2\sigma^2} = |a|\sigma\right]

Example 5.7.13 Let f (x) denote the probability density function of a normal random variable
with mean µ and variance σ 2 . Show that µ−σ and µ+σ are points of inflection of this function.
That is, show that f ′′ (x) = 0 when x = µ − σ or x = µ + σ.

Let c = \frac{1}{\sqrt{2\pi}\,\sigma}, so that

f(x) = c\, e^{-(x-\mu)^2/2\sigma^2} \ \Rightarrow\ f''(x) = c\,\sigma^{-4} e^{-(x-\mu)^2/2\sigma^2}(x-\mu)^2 - c\,\sigma^{-2} e^{-(x-\mu)^2/2\sigma^2}

and therefore f''(\mu - \sigma) = f''(\mu + \sigma) = c\,\sigma^{-2} e^{-1/2} - c\,\sigma^{-2} e^{-1/2} = 0

Exercise 5.23 Verify that the gamma density function integrates to 1.


Exercise 5.24 If X is an exponential random variable with mean \frac{1}{\lambda}, show that

E\left[X^k\right] = \frac{k!}{\lambda^k}; \quad k = 1, 2, \ldots

Hint: Make use of the gamma density function to evaluate E\left[X^k\right].
