
Chapter 4: Probability and probability distributions

The aim of this chapter is to familiarize you with fundamental probability theory and its applications. A probability is a quantitative measure of uncertainty. It is a number
between zero and one that conveys the strength of our belief in the occurrence of an uncertain
event.

4.1 Basic terminologies

a. Experiment: - A process which leads to the occurrence of one of several possible
observations (outcomes).
b. Sample space:- The set of all possible outcomes of a probability experiment.
- The outcomes forming the sample space must be mutually exclusive and exhaustive.
Mutually exclusive means that no two of the outcomes can both occur on a given trial of the
experiment.

For instance, consider the sample space for the experiment of tossing a coin. The two possible
outcomes are Head and Tail. These two outcomes are mutually exclusive because Head and
Tail cannot both occur on a given toss, and they are exhaustive since they are the only possible
results of the toss. The sample space is therefore the set of all possible outcomes of an
experiment; it is the universal set pertinent to a given experiment and is denoted by S.
Similarly, in the experiment consisting of two successive tosses of a coin, the possible
outcomes (the sample space) are: a Head on both tosses, a Tail on both tosses, a Head followed
by a Tail, and a Tail followed by a Head. In all, then, we have four sample outcomes.
Symbolically, let H stand for Head and T for Tail; then the sample space is HH, HT, TH, and
TT. Only one of the four outcomes can occur on a given trial, so they are mutually exclusive,
and they are exhaustive since they represent all possible outcomes of the experiment.

Tree diagrams:-

A tree diagram is a useful method to depict how the sample outcomes of a sample space are
generated; it helps ensure that all possibilities are identified. For the two-coin-toss experiment,
the tree branches into H and T on the first toss and again on the second toss:

First toss H -> second toss H or T (outcomes HH, HT)
First toss T -> second toss H or T (outcomes TH, TT)

Note that the set of all possible outcomes of an experiment is called a population or a sample
space.

C) Sample point:-

An individual outcome of the sample space (or population) is called a sample point. From the
above example of tossing a fair coin twice, the outcomes HH, HT, TH, and TT are each a
sample point.

D) Outcome:-

It is a particular result of an experiment, i.e. a single result of a chance situation. The full list
of outcomes covers every possibility (the outcomes are exhaustive), and no single outcome
overlaps any of the others (they are mutually exclusive).

e) Event:- It is a collection of one or more outcomes of an experiment, i.e. a group of
outcomes. For instance, consider the event that the number of points rolled with a die is
divisible by 3.

Since the sample space of this experiment is S = {1, 2, 3, 4, 5, 6}, the event is E = {3, 6}.

f) Probability of an event:- It is the ratio of the number of favorable outcomes to the total
number of possible outcomes. That is, probability of an event = (number of favorable
outcomes / total number of possible outcomes).

Example: The experiment of tossing two coins:

 Sample space = S = {HH, HT, TH, TT}
 The event of exactly one head = E = {HT, TH}
 Probability of exactly one head = ¼ + ¼ = 2/4 = ½
g) Mutually exclusive events:- Events such that the occurrence of any one of them means that
none of the others can occur at the same time.

Example: Tossing two coins:

 The set of possible outcomes is the sample space S = {HH, HT, TH, TT}
 The probability of each outcome is P(HH) = ¼; P(HT) = ¼; P(TH) = ¼; P(TT) = ¼
Therefore the four possible outcomes are mutually exclusive. That means that when HH
occurs, no other outcome can occur on that trial, and the same applies to the other outcomes.

h) Equally likely events:- It is a situation where the occurrence of one event is as likely as the
other event. For instance, in a toss of a single coin, a tail is as likely to occur as a head.

i) Collectively exhaustive events:- Events whose occurrences together exhaust all possible
outcomes of an experiment. In a roll of a single die, since 1, 2, 3, 4, 5 and 6 are the only
possible outcomes, they are collectively exhaustive events.

Counting techniques:

In order to determine the number of outcomes, one can use several rules of counting.

1. Multiplication rule: - In a sequence of n events in which the first event has k1 possibilities,
the second event has k2 possibilities, …, and the nth event has kn possibilities, the total number
of possibilities for the sequence is k1·k2·…·kn.

Example: - The personnel department of a large corporation wishes to issue each employee an
ID card with two letters followed by two digits. How many different ID cards are possible?

Solution

K1 K2 K3 K4

26 26 10 10

Thus the total number of possible ID cards is:

26*26*10*10 = 67600 (with repetition)

26*25*10*9 = 58500 (without repetition)
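The two counts above can be checked directly, and the "with repetition" case can even be verified by brute-force enumeration (a quick sketch; the standard 26 letters and 10 digits are assumed):

```python
import itertools
import string

letters = string.ascii_uppercase          # the 26 letters
digits = "0123456789"                     # the 10 digits

# With repetition: any letter/digit may appear twice.
with_rep = len(letters) * len(letters) * len(digits) * len(digits)

# Without repetition: the second letter and second digit must differ
# from the first, giving 26*25 letter pairs and 10*9 digit pairs.
without_rep = 26 * 25 * 10 * 9

# Cross-check the "with repetition" count by enumerating all cards.
all_cards = sum(1 for _ in itertools.product(letters, letters, digits, digits))

print(with_rep, without_rep, all_cards)  # 67600 58500 67600
```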

2. Permutation: is an arrangement of n objects in a specific order. In this case order is crucial.

a) The number of permutations of n objects taken all together is n!, i.e. n!/(n-n)! = n!

b) The number of arrangements of n distinct objects in a specific order, taking r objects at a time, is given by

nPr = n!/(n-r)! = n(n-1)(n-2)…(n-r+1)

c) The number of permutations of n objects in which k1 are alike, k2 are alike, …, kn are alike is

n! / (k1!k2!…kn!)

Example: A photographer wants to arrange 3 persons in a row for a photograph. How many
different photographs are possible?

Solution:

Assume the 3 persons are Aster (A), Lemma (L) and Yared (Y), so n = 3.

Since n! = 3! = 3*2! = 6, there are 6 possible arrangements: ALY, AYL, LAY, LYA, YLA and
YAL

Example 2: Fifteen athletes, including Haile, were entered in a race.

a) In how many different ways could prizes for first, second and third place be awarded?

b) How many of the triplets just counted have Haile in first position?

Solution:

a) 15 objects taken 3 at a time: 15P3 = 15! / (15-3)! = 2730

b) With Haile fixed in first position, the remaining two places can be filled in
14P2 = 14! / (14-2)! = 182 ways

3. Combination: - A counting technique in which the order of the objects is immaterial: the
selection of r objects from a collection of n objects, where r <= n, without regard to order. The
number of combinations of n objects taken r at a time is given by

nCr = n! / ((n-r)! r!)

Example: In a club containing 7 members, a committee of 3 people is to be formed. In how many
ways can the committee be formed?

Solution: 7C3 = 7! / ((7-3)! 3!) = 35
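Python's standard library exposes these counting functions directly (math.factorial, math.perm and math.comb, available since Python 3.8), so the worked examples above can be verified in a few lines:

```python
from math import comb, factorial, perm

# Permutations of 3 people in a row for a photograph: 3! = 6
photos = factorial(3)

# Race prizes: 15 athletes taken 3 at a time, order matters.
prizes = perm(15, 3)          # 15!/(15-3)! = 2730

# With Haile fixed in first place, arrange 2 of the remaining 14.
haile_first = perm(14, 2)     # 14!/(14-2)! = 182

# Committee of 3 from a club of 7: order is immaterial.
committees = comb(7, 3)       # 7!/((7-3)! 3!) = 35

print(photos, prizes, haile_first, committees)  # 6 2730 182 35
```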

Basic approaches to probability

Classical approach: - Uses the sample space to determine the numerical probability that an event
will happen. If there are n equally likely outcomes of an experiment, and event E occurs in
exactly k of those n outcomes, then the probability of event E, denoted P(E), is defined as

P(E) = n(E)/n(S) = k/n

Deficiencies of the classical approach

- It fails if the total number of outcomes is infinite or if it is not possible to enumerate all
elements of the sample space.
- It fails if the outcomes are not equally likely.
Example: In the experiment of tossing a coin and a die together, find the probability of the event
E consisting of a head together with an even number.

Solution: S = {H1, H2, H3, H4, H5, H6, T1, T2, T3, T4, T5, T6}, then

E = {H2, H4, H6}; thus P(E) = n(E)/n(S) = 3/12 = ¼

Let S be the sample space of an experiment. P is called a probability function if it satisfies the
following conditions:

0 ≤ P(A) ≤ 1 for each event A, where P(A) is called the probability of A, and P(S) = 1.

Note: If A and B are mutually exclusive events, then P(A ∪ B) = P(A) + P(B).

Similarly, for mutually exclusive events A1, A2, …, An:

P(A1 ∪ A2 ∪ … ∪ An) = P(A1) + P(A2) + … + P(An) = Σi P(Ai)

Relative frequency approach (empirical approach):- Suppose we repeat a certain experiment
n times; let A be an event of the experiment and let k be the number of times that event A
occurs. Then the ratio k/n is called the relative frequency of event A:

P(A) = (number of times event A has occurred) / (total number of observations) = k/n

In other words, given a frequency distribution, the probability of an event E being in a given
class is

P(E) = (frequency of the class) / (total frequency in the distribution)

Example: The National Center for Health Statistics reported that out of every 539 deaths in
recent years, 24 resulted from automobile accidents, 182 from cancer, and the rest from other
diseases. What is the probability that a particular death is due to an automobile accident?

Solution: P(automobile) = deaths due to automobile / total deaths = 24/539
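The empirical calculation is just a ratio of class frequency to total frequency; a minimal sketch using the figures from the example (exact arithmetic via fractions.Fraction):

```python
from fractions import Fraction

# Out of every 539 deaths, 24 were due to automobile accidents.
automobile, total_deaths = 24, 539

# Relative-frequency probability: class frequency / total frequency.
p_auto = Fraction(automobile, total_deaths)

print(p_auto)  # 24/539
```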

Rules of probability

Rule 1: Let A be an event and A′ be the complement of A with respect to a given sample space
of an experiment. Then P(A′) = 1 - P(A).

Proof:

Let S be a sample space.

S = A ∪ A′

A ∩ A′ = ∅ ⇒ P(A ∩ A′) = 0

P(S) = P(A ∪ A′) = P(A′) + P(A) - P(A ∩ A′)

1 = P(A′) + P(A) - 0 ⇒ 1 = P(A′) + P(A)

P(A′) = 1 - P(A)

Rule 2: Let A and B be events of a sample space S. Then

P(A′ ∩ B) = P(B) - P(A ∩ B)

Proof: B = S ∩ B = (A ∪ A′) ∩ B = (A ∩ B) ∪ (A′ ∩ B)

Case 1: If A ∩ B ≠ ∅, then P(B) = P(A ∩ B) + P(A′ ∩ B), so

P(A′ ∩ B) = P(B) - P(A ∩ B)

Case 2: If A ∩ B = ∅, then P(B) = P(A ∩ B) + P(A′ ∩ B) with P(A ∩ B) = P(∅) = 0

=> P(B) = P(A′ ∩ B)

Rule 3: Suppose A and B are two events of a sample space S. Then

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

Example: A fair die is thrown twice. Calculate the probability that the sum of the spots on the
faces that turn up is divisible by 2 or 3.

Solution:

S = {(1,1),(1,2),(1,3),(1,4),(1,5),(1,6),(2,1),(2,2),(2,3),(2,4),(2,5),(2,6),(3,1),
(3,2),(3,3),(3,4),(3,5),(3,6),(4,1),(4,2),(4,3),(4,4),(4,5),(4,6),(5,1),(5,2),(5,3),(5,4),(5,5),(5,6),
(6,1),(6,2),(6,3),(6,4),(6,5),(6,6)}

This sample space has 6*6 = 36 elements. Let E1 be the event that the sum of the spots on the
dice is divisible by 2 and E2 be the event that the sum is divisible by 3. Then

P(E1 or E2) = P(E1 ∪ E2)

= P(E1) + P(E2) - P(E1 ∩ E2)

= 18/36 + 12/36 - 6/36 = 24/36 = 2/3
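The three counts used above (18 even sums, 12 sums divisible by 3, 6 divisible by both) can be confirmed by enumerating all 36 outcomes — a quick sketch:

```python
from itertools import product

# All ordered pairs for two throws of a fair die.
sample_space = list(product(range(1, 7), repeat=2))

e1 = [(x, y) for x, y in sample_space if (x + y) % 2 == 0]  # sum divisible by 2
e2 = [(x, y) for x, y in sample_space if (x + y) % 3 == 0]  # sum divisible by 3
both = [p for p in e1 if p in e2]                           # sum divisible by 6

# Inclusion-exclusion: P(E1 ∪ E2) = P(E1) + P(E2) - P(E1 ∩ E2)
p_union = (len(e1) + len(e2) - len(both)) / len(sample_space)

print(len(e1), len(e2), len(both))  # 18 12 6
```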

4.2 Axioms and rules of probability


The probability of an event
The probability of an event always lies between 0 and 1 inclusive. When an event cannot
occur, its probability is zero: for instance, the probability of an empty set is zero, and the
probability of getting a hen from a snake's egg is zero, since that is impossible. On the other
hand, events that are certain to occur have a probability of one; the probability of the entire
(whole) sample space S is equal to one. Probabilities are therefore values of a set function, also
called a probability measure, which, as we shall see, assigns real numbers to the various
subsets of a sample space S. The following postulates apply when the sample space is discrete.
Postulate 1. The probability of an event is a non-negative real number,
i.e. P(A) ≥ 0, for any subset A of S.
Postulate 2. The probability of the sample space is equal to one,
i.e. P(S) = 1.
Postulate 3. If A1, A2, A3, … is a finite or infinite sequence of mutually exclusive events of S,
then

P(A1 ∪ A2 ∪ A3 ∪ …) = P(A1) + P(A2) + P(A3) + … = Σi P(Ai)

Example
If we have four mutually exclusive possible outcomes A, B, C and D, are the following
probabilities permissible?
a) P(A) = 0.12; P(B) = 0.63; P(C) = 0.46; P(D) = -0.21
Based on the above postulates, P(D) = -0.21 is not permissible, since it is not a non-negative
real number (it violates postulate 1).
b) P(A) = 9/120; P(B) = 45/120; P(C) = 27/120; P(D) = 46/120
This is not permissible, since it violates postulate 2:
P(S) = P(A ∪ B ∪ C ∪ D) = P(A) + P(B) + P(C) + P(D) = 127/120 ≠ 1
Theorem 1.1 If A is an event in a discrete sample space S, then P(A) equals the sum of the
probabilities of the individual outcomes comprising A.
Proof: Let Y1, Y2, Y3, … be the finite or infinite sequence of outcomes that comprise the event
A, so that A = Y1 ∪ Y2 ∪ Y3 ∪ … = ∪i Yi. Since the individual outcomes Yi are mutually
exclusive, the third postulate of probability gives:

P(A) = P(Y1) + P(Y2) + P(Y3) + … = Σi P(Yi), which completes the proof.

Example
If a coin is tossed twice, what is the probability of getting at least one T? (Hint: in statistics,
"at least one" means one or more.)
Solution: S = {HH, HT, TH, TT}, so the probability of each sample point is ¼.
Let A be the event that we get at least one T; then A = {HT, TH, TT} and
P(A) = P(HT) + P(TH) + P(TT) = ¼ + ¼ + ¼ = ¾
Theorem 1.2 If an experiment can result in any one of M different equally likely outcomes, and
if m of these outcomes together constitute event A, then the probability of event A is P(A) =
m/M.
Proof: - Let Y1, Y2, …, YM represent the individual outcomes in S, each with probability 1/M.
If A is a union of m of these mutually exclusive outcomes, then P(A) = P(Y1 ∪ Y2 ∪ Y3
∪ … ∪ Ym) = P(Y1) + P(Y2) + P(Y3) + … + P(Ym) = 1/M + 1/M + 1/M + … + 1/M = m/M
Some rules of probability
Theorem 1.3 If A and A′ are complementary events in a sample space S, then P(A′) = 1 - P(A).
Proof: - We know that A ∪ A′ = S and A ∩ A′ = ∅, by the properties of the complement
(the two events are mutually exclusive).
Then 1 = P(S)  by postulate 2
= P(A ∪ A′)
= P(A) + P(A′)  by postulate 3
Therefore, P(A′) = 1 - P(A)  algebra.

Example:
If an event occurs 40% of the time, it does not occur 60% of the time.
Theorem 1.4 The probability of an empty set is zero, i.e. P(∅) = 0 for any sample space S.
Proof: - Since S and ∅ are mutually exclusive and S ∪ ∅ = S (by the properties of ∅), then
P(S) = P(S ∪ ∅)
P(S) = P(S) + P(∅)  by postulate 3
P(S) - P(S) = P(∅) = 0  algebra
Therefore P(∅) = 0
Example

i. The probability of getting four heads from a toss of three coins is zero.
ii. The probability of getting a pigeon from a snake is zero.

Theorem 1.5 If A and B are events in a sample space S, and A ⊂ B, then P(A) ≤ P(B).
Proof: - Since A ⊂ B, we can write B as a union of mutually exclusive events, and we get:
P(B) = P[A ∪ (A′ ∩ B)]
P(B) = P(A) + P(A′ ∩ B)  by postulate 3
P(B) ≥ P(A)  by postulate 1
Example
If a coin is tossed twice:
i) the probability of the sample space S is:
P(S) = P(HH ∪ HT ∪ TH ∪ TT) = P(HH) + P(HT) + P(TH) + P(TT) = ¼ + ¼ + ¼ +
¼ = 4/4 = 1
ii) the probability of getting two heads:
Let A be the event that contains two heads, i.e. A = {HH}; therefore A ⊂ S, so P(A)
= P(HH) = ¼, and hence P(S) ≥ P(A) => 1 ≥ ¼.

Theorem 1.6 0 ≤ P(A) ≤ 1, for any event A.

Proof: - By theorem 1.5 and the fact that ∅ ⊂ A ⊂ S for any event A in a sample space S, we
have P(∅) ≤ P(A) ≤ P(S), where P(∅) = 0 and P(S) = 1 by theorem 1.4 and postulate 2
respectively. Therefore 0 ≤ P(A) ≤ 1: P(A) cannot be greater than one or less than zero, by
postulates 1 and 2 and the definition of probability.
Theorem 1.7 If A and B are any two events in a sample space S, then P(A ∪ B) = P(A) + P(B) -
P(A ∩ B)
Proof:- With the help of a Venn diagram, let a, b and c be the probabilities of the mutually
exclusive events A ∩ B, A ∩ B′ and A′ ∩ B respectively. Then
P(A ∪ B) = a + b + c
= (a + b) + (c + a) - a
= P(A) + P(B) - P(A ∩ B)
Example:
In a sample of 1000 students, 640 said they had a pen, 350 said they had a pencil, and 200 said
they had both. If a student is selected at random, the probability that the student has either a
pen or a pencil will be:
Solution: Let S be the sample space, E the event for a pen and R the event for a pencil. In a
Venn diagram the regions are: 440 students with a pen only, 200 with both, and 150 with a
pencil only.

P(E ∪ R) = P(E) + P(R) - P(E ∩ R)
= 640/1000 + 350/1000 - 200/1000
= 0.64 + 0.35 - 0.20 = 0.79, or 79%
Theorem 1.8. If A, B and C are any three events in a sample space S, then
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C).
Proof:- A ∪ B ∪ C = A ∪ (B ∪ C), so applying theorem 1.7 to P[A ∪ (B ∪ C)] and to
P(B ∪ C):
P(A ∪ B ∪ C) = P(A) + P(B ∪ C) - P[A ∩ (B ∪ C)]  theorem 1.7
= P(A) + P(B) + P(C) - P(B ∩ C) - P[(A ∩ B) ∪ (A ∩ C)]
= P(A) + P(B) + P(C) - P(B ∩ C) - {P(A ∩ B) + P(A ∩ C) - P[(A ∩ B) ∩ (A ∩ C)]}
Therefore P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C)
Example: Based on a Venn diagram with the following region probabilities, determine
P(A ∪ B ∪ C): A only = 0.06, B only = 0.18, C only = 0.06, A ∩ B only = 0.24, A ∩ C only =
0.06, B ∩ C only = 0.14, and A ∩ B ∩ C = 0.22.
Solution
P(S) = 1
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(B ∩ C) - P(A ∩ B) - P(A ∩ C) + P(A ∩ B ∩ C)
= 0.58 + 0.78 + 0.48 - 0.36 - 0.46 - 0.28 + 0.22
= 0.96
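A sketch that rebuilds the three-set example from the Venn-diagram region probabilities and checks the inclusion-exclusion formula:

```python
# Probability of each mutually exclusive Venn-diagram region.
regions = {
    "A": 0.06, "B": 0.18, "C": 0.06,      # exactly one set
    "AB": 0.24, "AC": 0.06, "BC": 0.14,   # exactly two sets
    "ABC": 0.22,                          # all three sets
}

def p(*sets):
    """Probability of the intersection of the named sets."""
    return sum(v for k, v in regions.items() if all(s in k for s in sets))

# Inclusion-exclusion for three events.
p_union = (p("A") + p("B") + p("C")
           - p("A", "B") - p("A", "C") - p("B", "C")
           + p("A", "B", "C"))

print(round(p("A"), 2), round(p("B"), 2), round(p("C"), 2), round(p_union, 2))
```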
4.3 Conditional probability and Bayes’ theorem

Conditional probability

Difficulties can easily arise when probabilities are quoted without specification of the sample
space. For instance, if we ask for the probability that an economist makes more than 240,000
birr per year, we may get several answers, and they may all be correct: one might apply to all
Economics school graduates, another to all persons licensed as economists, a third to all those
actively engaged in the practice of the profession. Since the choice of the sample space is by no
means always self-evident, it often helps to use the symbol P(A/S) to denote the conditional
probability of event A given S. In other words, the probability of occurrence of something is
conditional upon your information set; conditional probability is the probability of occurrence
of one outcome given that another outcome has occurred.

Definition 1.1 If A and B are any two events in a sample space S, and P(A) ≠ 0, the conditional
probability of B given A is:

P(B/A) = P(A ∩ B) / P(A)
Example
1) Suppose that in a fertilizer dealer enterprise an experiment is conducted in which two items
are successively selected at random from a lot containing 20 quintals of defective and 60
quintals of non-defective fertilizer. Let A be the event that the first fertilizer selected is
defective and B the event that the second is defective. Then P(A) = 20/80 = ¼. In computing
P(B), it is assumed that the experiment is conducted without replacement, that is, the defective
fertilizer is not thrown back into the lot before the next fertilizer is selected. In this case P(B)
depends on whether A has occurred or not.
Therefore P(B/A), the probability of B given that A has occurred, is P(B/A) = 19/79. This is
because after the first selection was found to be defective, only 19 defective quintals remain
among the remaining 79 quintals of fertilizer. Hence, applying the general rule of
multiplication, the probability that both fertilizers are defective is given by:
P(A ∩ B) = P(A) P(B/A) = (20/80)(19/79) = 380/6320 = 38/632 ≈ 0.06
Example
2) Suppose a die is loaded in such a way that each odd number is twice as likely to occur as
each even number.
i. What is the probability that a number greater than 3 occurs on a single roll of the die?
ii. What is the probability that the number of points rolled is a perfect square?
iii. What is the probability that it is a perfect square, given that it is greater than 3?
Solution
Let X be the event that the number of points rolled is greater than 3 and Y the event that it is
a perfect square; we have X = {4, 5, 6}, Y = {1, 4} and X ∩ Y = {4}. Let the sample space
contain the elements S = {1, 2, 3, 4, 5, 6}, and assign probability w to each even number and
probability 2w to each odd number.
Then, i. P(S) = 2w + w + 2w + w + 2w + w = 9w = 1 by postulate 2, so w = 1/9 and
P(S) = 2/9 + 1/9 + 2/9 + 1/9 + 2/9 + 1/9 = 1
Therefore P(X) = P(4) + P(5) + P(6) = 1/9 + 2/9 + 1/9 = 4/9
ii. P(Y) = P(1) + P(4) = 2/9 + 1/9 = 3/9 = 1/3.
iii. P(Y/X) = P(X ∩ Y)/P(X) = (1/9)/(4/9) = ¼.
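A sketch of the loaded-die computation using exact fractions (w = 1/9 for each even face, 2/9 for each odd face):

```python
from fractions import Fraction

# Loaded die: each odd face is twice as likely as each even face.
w = Fraction(1, 9)
prob = {face: (2 * w if face % 2 == 1 else w) for face in range(1, 7)}

X = {4, 5, 6}   # greater than 3
Y = {1, 4}      # perfect square

p_x = sum(prob[f] for f in X)
p_y = sum(prob[f] for f in Y)
p_y_given_x = sum(prob[f] for f in X & Y) / p_x   # P(Y|X) = P(X∩Y)/P(X)

print(p_x, p_y, p_y_given_x)  # 4/9 1/3 1/4
```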
If we multiply the expression on both sides of the formula of definition 1.1 by P(A), we obtain
the following rule.
Theorem 1.9 If A and B are any two events in a sample space S and P(A) ≠ 0, then
P(A ∩ B) = P(A) · P(B/A)
Alternatively, if P(B) ≠ 0, the probability that A and B will both occur is the product of the
probability of B and the conditional probability of A given B, i.e. P(B ∩ A) = P(B) · P(A/B).

Example

1. If you randomly pick two sacks of teff in succession from a store of 240 sacks of teff, of
which 15 are defective, what is the probability that both will be defective?
Solution: Assume there is an equal probability for each selection, and let A be the event that
the 1st sack of teff is defective and B the event that the 2nd is defective. Hence, the probability
that the 1st sack will be defective is P(A) = 15/240, and the probability that the 2nd sack will
be defective given that the 1st is defective is P(B/A) = 14/239. Thus the probability that both
sacks will be defective is P(A ∩ B) = P(A) · P(B/A) = 15/240 × 14/239 = 7/1912. This is
called sampling without replacement, because the first sack of teff is not replaced before the
2nd sack is selected.
2. Based on the above example, what is the probability that both will not be defective?

Solution: If A is the event that the 1st item is defective, then A′ is the event that the 1st is
non-defective, and analogously B′ is the event that the 2nd sack is non-defective. Hence,
P(A′ ∩ B′) = P(A′) · P(B′/A′) = 225/240 × 224/239, because P(A′) = 225/240 and
P(B′/A′) = 224/239.
3. Suppose you roll two fair dice and consider the events A = {(X,Y) / X + Y = 11} and
B = {(X,Y) / X > Y}, where X is the outcome of the first die and Y the outcome of the 2nd
die. What is the probability of B given A (i.e. the probability that X > Y given X + Y = 11)?
Solution:
1st, find event A = {(5, 6), (6, 5)}
=> P(A) = 2/36
2nd, event B = {(2, 1), (3, 1), (4, 1), (5, 1), (6, 1), (3, 2), (4, 2), (5, 2), (6, 2), (4, 3), (5, 3),
(6, 3), (5, 4), (6, 4), (6, 5)}
=> P(B) = 15/36
3rd, the event A ∩ B = {(6, 5)}
=> P(A ∩ B) = 1/36
Therefore P(B/A) = P(A ∩ B)/P(A) = (1/36)/(2/36) = ½
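The same conditional probability can be checked by enumerating all 36 equally likely outcomes — a quick sketch:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # (X, Y) for two fair dice

A = [(x, y) for x, y in outcomes if x + y == 11]  # sum equals 11
B = [(x, y) for x, y in outcomes if x > y]        # first die beats the second

# P(B|A) = |A ∩ B| / |A|, since every outcome is equally likely.
p_b_given_a = len([o for o in A if o in B]) / len(A)

print(len(A), len(B), p_b_given_a)  # 2 15 0.5
```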
Theorem 1.10 If A, B, and C are any three events in a sample space S such that
P(A ∩ B) ≠ 0, then P(A ∩ B ∩ C) = P(A) · P(B/A) · P[C/(A ∩ B)]
Proof: - It is known that A ∩ B ∩ C = (A ∩ B) ∩ C; using theorem 1.9 we get:
P(A ∩ B ∩ C) = P[(A ∩ B) ∩ C] = P(A ∩ B) · P[C/(A ∩ B)] = P(A) · P(B/A) · P[C/(A ∩ B)]
Example:-
1. A wholesale container holds 20 sacks of fertilizer, of which 5 sacks are defective. If 3 of
the sacks are selected at random and removed from the container in succession without
replacement, what is the probability that all 3 sacks are defective?
Solution: - Let A be the event that the 1st sack of fertilizer is defective, B the event that the
2nd is defective, and C the event that the 3rd is defective.
Then P(A) = 5/20; P(B/A) = 4/19; and P[C/(A ∩ B)] = 3/18, and substitution into the formula
gives: P(A ∩ B ∩ C) = P(A) · P(B/A) · P[C/(A ∩ B)] = 5/20 × 4/19 × 3/18 = 1/114

4.3.2 Independent events

We sometimes encounter situations in the real world where the occurrence of one event does
not affect the probability of the occurrence of another. In other words, two events A and B are
independent if the occurrence or non-occurrence of either one does not affect the probability of
the occurrence of the other.
Definition 1.2 Two events A and B are independent if and only if P(A ∩ B) = P(A) · P(B)
We can elaborate the above definition as follows:
Two events A and B are said to be independent of each other if and only if:
i. P(A/B) = P(A), or
ii. P(B/A) = P(B), or
iii. P(A ∩ B) = P(A) · P(B)
Example
1. An aircraft has two independent safety systems. The probability that the 1st will not operate
properly in an emergency, P(A), is 0.01, and the probability that the 2nd will not operate in
an emergency, P(B), is 0.02. What is the probability that in an emergency both of the safety
systems will fail to operate?
Solution:
The probability that both will not operate is:
P(A ∩ B) = P(A) · P(B) = 0.01 × 0.02 = 0.0002
Theorem 1.11 If A and B are independent, then A and B′ are also independent,
i.e. P(A ∩ B) = P(A) · P(B) ⇒ P(A ∩ B′) = P(A) · P(B′)
Proof: - Since A = (A ∩ B) ∪ (A ∩ B′), and these are mutually exclusive:
P(A) = P[(A ∩ B) ∪ (A ∩ B′)]
P(A) = P(A ∩ B) + P(A ∩ B′)  postulate 3 and theorem 1.1
P(A) = P(A) · P(B) + P(A ∩ B′)  given that A and B are independent
P(A) - P(A) · P(B) = P(A ∩ B′)  algebra
P(A) · [1 - P(B)] = P(A ∩ B′)  distributive property
P(A) · P(B′) = P(A ∩ B′)  theorem 1.3, and hence A and B′ are
independent.
Definition 1.3. Events A1, A2, …, Ak are independent if and only if the probability of the
intersection of any 2, 3, …, or k of these events equals the product of their respective
probabilities,
i.e. P(A1 ∩ A2 ∩ … ∩ Ak) = P(A1) · P(A2) · P(A3) · … · P(Ak)
Example:
1) Using a Venn diagram with region probabilities A only = 0.06, A ∩ B only = 0.24,
A ∩ C only = 0.06, A ∩ B ∩ C = 0.24, B only = 0.18, B ∩ C only = 0.14, C only = 0.06,
and 0.02 outside all three sets, show that P(A ∩ B ∩ C) = P(A) · P(B) · P(C).

Solution: - P(A) = 0.06 + 0.06 + 0.24 + 0.24 = 0.6
P(B) = 0.24 + 0.24 + 0.14 + 0.18 = 0.8
P(C) = 0.06 + 0.06 + 0.14 + 0.24 = 0.5
P(A ∩ B ∩ C) = 0.24 and P(A) · P(B) · P(C) = 0.6 × 0.8 × 0.5 = 0.24, so the equality holds.
4.3.3 Bayes' theorem
Theorem 1.12. If the events B1, B2, …, Bk constitute a partition of the sample space S and
P(Bi) ≠ 0 for i = 1, 2, …, k, then for any event A in S:

P(A) = Σi P(Bi) · P(A/Bi), for i = 1 to k.

Example:-
1. The completion of a given job in a textile factory may be delayed due to machinery
problems. The probability is 0.55 that the machinery will be defective, 0.75 that the job will
be completed on time if the machinery is good, and 0.35 that the job will be completed on
time if the machinery is defective. What is the probability that the job will be completed on
time?
Solution: Let A be the event that the job will be completed on time, B the event that the
machinery will be defective, and B′ the event that the machinery will be good.
Given: - P(B) = 0.55, so P(B′) = 1 - 0.55 = 0.45; A ∩ B and A ∩ B′ are mutually
exclusive, and we apply the alternative rules of multiplication:
P(A/B′) = 0.75
P(A/B) = 0.35
P(A) = P[(A ∩ B) ∪ (A ∩ B′)] = P(A ∩ B) + P(A ∩ B′) = P(B) · P(A/B) + P(B′) · P(A/B′)
= 0.55 × 0.35 + 0.45 × 0.75
= 0.1925 + 0.3375 = 0.53
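A sketch of the total-probability computation for the machinery example:

```python
# Partition of the sample space: defective vs. good machinery.
p_defective = 0.55
p_good = 1 - p_defective

# Conditional probabilities of on-time completion.
p_on_time_given_defective = 0.35
p_on_time_given_good = 0.75

# Law of total probability: P(A) = Σ P(Bi) · P(A|Bi)
p_on_time = (p_defective * p_on_time_given_defective
             + p_good * p_on_time_given_good)

print(round(p_on_time, 4))  # 0.53
```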
Theorem 1.13. If B1, B2, …, Bk constitute a partition of the sample space S and
P(Bi) ≠ 0 for i = 1, 2, …, k, then for any event A in S such that P(A) ≠ 0:

P(Br/A) = [P(Br) · P(A/Br)] / [Σi P(Bi) · P(A/Bi)], for r = 1, 2, 3, …, k

It is easy to see that theorem 1.13 is the combination of theorem 1.9 and theorem 1.12, and
therefore we can restate the theorem as follows.
If an event A can only occur in conjunction with one of the k mutually exclusive and
exhaustive events B1, B2, …, Bk, then the probability that A was preceded by the particular
event Br is given by:

P(Br/A) = P(A ∩ Br)/P(A) = [P(Br) · P(A/Br)] / [Σi P(Bi) · P(A/Bi)]; this is called Bayes'
theorem.
Proof: - Since A can occur in combination with any of the mutually exclusive and
exhaustive events B1, B2, …, Bk, we have A = (A ∩ B1) ∪ (A ∩ B2) ∪ … ∪ (A ∩ Bk), so
P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bk)
= P(B1) · P(A/B1) + P(B2) · P(A/B2) + … + P(Bk) · P(A/Bk) = Σi P(Bi) · P(A/Bi)
For any particular event Br, the conditional probability P(Br/A) is given by
P(Br ∩ A) = P(A) · P(Br/A), so

P(Br/A) = P(Br ∩ A)/P(A) = [P(Br) · P(A/Br)] / [Σi P(Bi) · P(A/Bi)]
Example:
1. Aba Tour firm rents cars for its tourists from three rental agencies: Alfa, Gamma and Delta.
It rents 55% from agency Alfa, 35% from agency Gamma and 10% from agency Delta. 12%
of the cars from agency Alfa need a tune-up, 16% of the cars from agency Gamma need a
tune-up, and 8% of the cars from agency Delta need a tune-up. If a rental car needs a tune-up,
what is the probability that it came from rental agency Gamma?
Solution: i) First let's find the probability that a rental car will need a tune-up. Let F denote
Alfa, G Gamma and D Delta, and let A be the event that a car needs a tune-up.
P(F) = 0.55; P(G) = 0.35; P(D) = 0.10 and P(A/F) = 0.12; P(A/G) = 0.16; P(A/D) = 0.08
Therefore P(A) = P(F) · P(A/F) + P(G) · P(A/G) + P(D) · P(A/D)
= 0.55 × 0.12 + 0.35 × 0.16 + 0.10 × 0.08
= 0.066 + 0.056 + 0.008 = 0.13
ii) Then the probability that it came from rental agency Gamma, given that it needs a tune-up,
is

P(G/A) = [P(G) · P(A/G)] / [P(F) · P(A/F) + P(G) · P(A/G) + P(D) · P(A/D)]
= 0.056/0.13 ≈ 0.431

4.4 Probability Distributions

Probability distribution: is a list of all the possible outcomes of an experiment and the probability
associated with each outcome.

Example: Suppose we are interested in the number of heads showing face up on 3 tosses of a coin. This is
the experiment, and the possible outcomes are 0 heads, 1 head, 2 heads, and 3 heads. What is the
probability distribution for the number of heads?

Solution: The experiment has 8 possible outcomes, listed below with the number of heads in each.

Possible result   1st toss   2nd toss   3rd toss   No. of heads
1                 T          T          T          0
2                 T          T          H          1
3                 T          H          T          1
4                 T          H          H          2
5                 H          T          T          1
6                 H          T          H          2
7                 H          H          T          2
8                 H          H          H          3

From the above table, the probability distribution for the number of heads is:

No. of heads, x   P(x)
0                 1/8
1                 3/8
2                 3/8
3                 1/8
Total             1

4.4.1 Random variables.

A random variable is a quantity resulting from an experiment that can assume different values.

In any experiment of chance, the outcomes occur randomly. For example, rolling a single die is an
experiment; and any one of the six possible outcomes can occur at a time.

A random variable may be either discrete or continuous.

i. Discrete random variable: a variable that results from counting and can assume only certain clearly
separated values of some item of interest.

Example: The number of heads in flipping a fair coin 5 times.

ii. Continuous random variable: a variable that results from measuring and can take any value within a
certain range of values.

Example: The distance between Sodo and Addis Ababa could be 330 km, 330.5 km, 331.5 km, and so on,
depending on the accuracy of our measuring device.
4.4.2. Discrete probability distributions (probability mass function), expectation and variance of
discrete random variable

If the possible values of a discrete random variable are organized in a probability distribution, the
distribution is called a discrete probability distribution; it is also called a probability mass function (pmf).
It can be summarized by its mean and variance.

Mean: The mean of a probability distribution is also referred to as its expected value, E(x), and is given by

Mean = μ = E(x) = ∑(x P(x)),

where P(x) is the probability of the possible value x of the random variable.

Variance & standard deviation: Though the mean is a typical value used to summarize a discrete
probability distribution, it does not describe the spread in the distribution; the variance does this.

σ² = ∑((x − μ)² P(x)) = ∑(x² P(x)) − μ²

Standard deviation: σ = √variance

Example: the following is the probability distribution for the number of cars a company expects to sell on
a particular day.

No. of cars sold, x Probability. P(x)

0 0.1

1 0.2

2 0.3

3 0.3

4 0.1

Total 1.0

1. What type of distribution is it?

2. On a typical day, how many cars does the company expect to sell?

3. What is the variance of the distribution? What is the standard deviation?

Solution:

1. It is a discrete probability distribution.


2. μ = E(x) = ∑(x P(x))
= 0(0.1) + 1(0.2) + 2(0.3) + 3(0.3) + 4(0.1)
= 2.1

Interpretation: Over a large number of days, the company expects to sell 2.1 cars a day. Of course, it is not
possible to sell exactly 2.1 cars on any particular day; thus the mean is sometimes called the
expected value.

3. σ² = ∑(x² P(x)) − μ²
= (0²(0.1) + 1²(0.2) + … + 4²(0.1)) − (2.1)² = 1.29

σ = √1.29 ≈ 1.136
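As a quick check, the mean and variance formulas can be computed directly. A minimal Python sketch, assuming the car-sales table above:

```python
# Discrete probability distribution from the car-sales example: {x: P(x)}.
dist = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.3, 4: 0.1}

# E(x) = sum(x * P(x))
mean = sum(x * p for x, p in dist.items())

# Var(x) = sum(x**2 * P(x)) - mean**2
variance = sum(x**2 * p for x, p in dist.items()) - mean**2

print(mean, variance)  # 2.1 and 1.29, up to floating-point rounding
```

This reproduces E(x) = 2.1 and σ² = 1.29 from the worked solution.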

4.4.3. Common discrete probability distributions

1. Binomial distribution.

It is used to represent the probability distribution of discrete random variables. Binomial means two
categories. The successive repetition of an observation (trial) may result in an outcome which possesses
or which does not possess a specified character. Our primary interest will be either of these possibilities.
Conventionally, the outcome of primary interest is termed as success. The alternative outcome is termed
as failure. These terminologies are used irrespective of the nature of the outcome. For example, non-
germination of a seed may be termed as success.
Properties:
1. There must be only two mutually exclusive outcomes: success or failure.
2. The probability of success, p, and the probability of failure, q=1-p, remains constant from one
trial to another.
3. The probability of success in one trial is totally independent of any other trial.
4. The experiment can be repeated many times
Example: The coin flip experiment has only two possible outcomes: head or tail. The probability of each
is known and constant from one trial to another. We can flip a coin many times.
The binomial distribution is computed by

P(x) = nCx (p^x)(q^(n-x)), where C = combination,

n= number of trials
x=number of successes
p=the probability of success
q=1-p=the probability of failure
Mean of a binomial distribution: μ = np

Variance of a binomial distribution: σ² = npq
Example: There are 5 flights daily from Addis Ababa to Washington, suppose the probability that any
flight arrives late is 0.2. What is the probability that

a. None of the flights are late today?


b. Exactly one flight is late today?
c. Construct the entire probability distribution
d. What is the probability that less than 3 flights are late?
e. What is the probability that more than 4 flights are late?
f. Between 2 and 4 (inclusive) flights are late?
g. Exactly 2 flights are not late?
h. What is the mean?
i. What is the variance?
Solution: given that the probability of a particular flight is late is 0.2, and thus the probability that a
particular flight is not late is 0.8. There are 5 flights, so n = 5, and x refers to the number of successes. In
the questions a to e, we are asked about the late flights, so here let success = late flight. Then p = 0.2, and
q = 0.8.

a. P (none of the flights are late today) = P (0 flights are late) = P (x = 0)

P(x) = nCx (p^x)(q^(n-x))

P(0) = 5C0 (0.2^0)(0.8^5) = 0.3277

b. P (exactly one flight is late today) = P (1 flight is late) = P (x = 1)

P(1) = 5C1 (0.2^1)(0.8^4) = 0.4096

c. The entire distribution is


Number of P (x)
late flights, x

0 0.3277

1 0.4096

2 0.2048

3 0.0512

4 0.0064

5 0.0003

Total 1.0000

d. P (less than 3 flights are late today) = P (x < 3) = P (x = 0) + P (x = 1) + P (x = 2)


From the above table P (x < 3) = 0.3277 + 0.4096 + 0.2048 = 0.9421

e. P (x > 4) = P (x = 5) = 0.0003
f. P (2 ≤ x ≤ 4) = P (x = 2) + P (x = 3) + P (x = 4) = 0.2048 + 0.0512 + 0.0064 = 0.2624
g. P (exactly 2 flights are not late) = ?

Here we are asked about the not-late flights, so we let success = not-late flight.

So p = 0.8, and q = 0.2.

Then P (exactly 2 flights are not late) = P(2) = 5C2 (0.8^2)(0.2^3) = 0.0512

h. μ = np = 5 * 0.2 = 1 late flight, or 5 * 0.8 = 4 not-late flights

i. σ² = npq = 5 * 0.2 * 0.8 = 0.8
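The binomial calculations above are easy to verify programmatically. A minimal sketch in Python (`binomial_pmf` is an illustrative helper name, not from the text):

```python
from math import comb

def binomial_pmf(x, n, p):
    # P(x) = nCx * p**x * q**(n - x), with q = 1 - p
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Flight example: n = 5 flights, P(a flight is late) = 0.2
print(round(binomial_pmf(0, 5, 0.2), 4))  # 0.3277  (no flight late)
print(round(binomial_pmf(1, 5, 0.2), 4))  # 0.4096  (exactly one late)

# d. P(x < 3) = P(0) + P(1) + P(2)
print(round(sum(binomial_pmf(x, 5, 0.2) for x in range(3)), 4))  # 0.9421

# Mean and variance: mu = n*p, sigma**2 = n*p*q
print(5 * 0.2, 5 * 0.2 * 0.8)
```

The rounded values match the distribution table and answers a, b, and d.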
2. The Poisson distribution
The Poisson distribution is also used to represent the probability distribution of a discrete random
variable. It is employed in describing random events that occur rarely over some unit of time or
space.

Examples of events where Poisson probability function can be used:

 Number of telephone calls per hour


 Number of typing errors per page
 Number of accidents on a particular road per day
 Hospital emergencies per day,
etc

Assumptions:
1. The probability of occurrence of an event is constant for any two intervals of time or space
2. The occurrence of an event in any interval is independent of the occurrence in any other interval.
Having these assumptions, the Poisson distribution is given by the function

P(x) = (μ^x e^(-μ)) / x!

Where x = the number of times the event has occurred,

μ = the mean number of occurrences per unit of time or space, and

e = 2.71828…, the base of the natural logarithm system.
Example: Simple observation over the past 80 hours has shown that 800 customers have entered the shop.
What is the probability that

a. exactly 5 customers will arrive during any given hour?


b. more than 3 customers will arrive during any given hour?

c. exactly 5 customers will arrive during any 30 minutes?
Solution: μ = 800 customers / 80 hours = 10 customers per hour

a. P (x = 5) = (10^5 × e^(-10)) / 5! = 0.0378
b. P (x > 3) = P(4) + P(5) + …

By the complement rule that we have discussed earlier, P (x > 3) = 1 − P (x ≤ 3)

= 1 − [P(0) + P(1) + P(2) + P(3)]

= 1 − [(10^0 e^(-10))/0! + (10^1 e^(-10))/1! + (10^2 e^(-10))/2! + (10^3 e^(-10))/3!]

= 1 − 0.0103 = 0.9897

c. P (x = 5 in 30 minutes)

Here, as we are asked per 30 minutes, we should rescale μ to a 30-minute interval:

μ = 10 customers per hour = 10 customers per 60 minutes = 5 customers per 30 minutes

P (x = 5) = (5^5 × e^(-5)) / 5! ≈ 0.175
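These Poisson calculations can likewise be checked with a few lines of Python (a sketch; `poisson_pmf` is an illustrative helper name, not from the text):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    # P(x) = mu**x * e**(-mu) / x!
    return mu**x * exp(-mu) / factorial(x)

# 800 customers in 80 hours -> mu = 10 customers per hour
print(round(poisson_pmf(5, 10), 4))  # 0.0378

# b. P(x > 3) = 1 - P(x <= 3)
print(round(1 - sum(poisson_pmf(x, 10) for x in range(4)), 4))  # 0.9897

# c. rescaled to 30 minutes: mu = 5; P(x = 5) ~ 0.1755 (the text rounds to 0.175)
print(round(poisson_pmf(5, 5), 4))
```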

4.5 Continuous probability distribution

Continuous probability distribution is also called probability density function (pdf)

Let x be a continuous random variable; then the pdf of x is a function f(x) such that, for any two numbers
a and b with a ≤ b,

P (a ≤ x ≤ b) = ∫[a to b] f(x) dx,

which is the area under the curve bounded by x = a and x = b.

If f(x) is a pdf of x, then

1. f(x) ≥ 0 for all x

2. ∫[-∞ to ∞] f(x) dx = 1,

i.e. the area under the graph of f(x) must equal 1, since the sum of relative frequencies is 1.

Example: The diameter of an electronic cable, say x, is assumed to be a continuous random variable with
pdf f(x) = 6x(1 − x), 0 ≤ x ≤ 1.

1. Check f(x) is pdf
2. Determine number ‘b’ such that P(x<b)=P(x>b)

So/n: 1. To check that f(x) is a pdf, we should check the two points:

i. f(x) ≥ 0 for all x: since 6x(1 − x) ≥ 0 for 0 ≤ x ≤ 1, this holds.

ii. ∫[-∞ to ∞] f(x) dx = 1:

∫[0 to 1] 6x(1 − x) dx = ∫[0 to 1] (6x − 6x²) dx = [6x²/2 − 6x³/3] from 0 to 1 = 3 − 2 = 1
2. P(x < b) = P(x > b) means P(−∞ < x < b) = P(b < x < ∞)

∫[-∞ to b] f(x) dx = ∫[b to ∞] f(x) dx

∫[0 to b] 6x(1 − x) dx = ∫[b to 1] 6x(1 − x) dx

[6x²/2 − 6x³/3] from 0 to b = [6x²/2 − 6x³/3] from b to 1

3b² − 2b³ = (3(1)² − 2(1)³) − (3b² − 2b³)

3b² − 2b³ = 1 − 3b² + 2b³

4b³ − 6b² + 1 = 0
Then we can solve mathematically for b, and we will take the value of b that lies in the given range of the
function only.
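The cubic 4b³ − 6b² + 1 = 0 can be solved numerically. A bisection sketch in Python (illustrative only); it confirms the root in (0, 1) is b = 0.5, which matches the symmetry of f(x) = 6x(1 − x) about x = 0.5:

```python
# Median equation from above: g(b) = 4b**3 - 6b**2 + 1 = 0 on (0, 1).
def g(b):
    return 4 * b**3 - 6 * b**2 + 1

# Bisection: g(0) = 1 > 0 and g(1) = -1 < 0, so a root lies in between.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid

b = (lo + hi) / 2
print(b)  # ~0.5
```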

Expected value and variance of a continuous random variable:

1. E(x) = μ = ∫[-∞ to ∞] x f(x) dx

2. Var(x) = σ² = ∫[-∞ to ∞] (x − μ)² f(x) dx = ∫[-∞ to ∞] x² f(x) dx − μ²
Example: Calculate E(x) and Var(x) for the following function:

f(x) = 2x, 0 ≤ x ≤ 1
So/n: 1. E(x) = μ = ∫[-∞ to ∞] x f(x) dx = ∫[0 to 1] x(2x) dx = ∫[0 to 1] 2x² dx = [2x³/3] from 0 to 1 = 2/3

2. Var(x) = σ² = ∫[-∞ to ∞] x² f(x) dx − μ² = ∫[0 to 1] x²(2x) dx − (2/3)²

= ∫[0 to 1] 2x³ dx − 4/9 = [2x⁴/4] from 0 to 1 − 4/9 = 1/2 − 4/9 = 1/18
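The integrals above can be sanity-checked numerically. A midpoint-rule sketch in Python (the helper `integrate` is an assumption for illustration, not from the text):

```python
# Midpoint Riemann sum of h over [a, b] with n subintervals.
def integrate(h, a=0.0, b=1.0, n=100_000):
    dx = (b - a) / n
    return sum(h(a + (i + 0.5) * dx) for i in range(n)) * dx

f = lambda x: 2 * x  # the pdf f(x) = 2x on [0, 1]

mean = integrate(lambda x: x * f(x))              # E(x) = 2/3
var = integrate(lambda x: x**2 * f(x)) - mean**2  # Var(x) = 1/2 - 4/9 = 1/18

print(mean, var)  # approximately 0.6667 and 0.0556
```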
The cumulative distribution function (cdf), F(x)

If x is a continuous random variable with pdf f(x), then

F(x) = P(X ≤ x) = ∫[-∞ to x] f(t) dt, −∞ < x < ∞

Properties

1. 0 ≤ F(x) ≤ 1
2. F′(x) = f(x)
3. F(−∞) = 0, F(∞) = 1
4. P(a ≤ x ≤ b) = F(b) − F(a)
Example: Given f(x) = 6x(1 − x), 0 ≤ x ≤ 1,

1. Find F(x)
2. What is P(0.3 ≤ x ≤ 0.8)?

So/n: 1. F(x) = ∫[-∞ to x] f(t) dt = ∫[0 to x] 6t(1 − t) dt, 0 ≤ x ≤ 1

= ∫[0 to x] 6t dt − ∫[0 to x] 6t² dt = [6t²/2] from 0 to x − [6t³/3] from 0 to x

=> F(x) = 3x² − 2x³

2. P(0.3 ≤ x ≤ 0.8) = F(0.8) − F(0.3) = (3(0.8)² − 2(0.8)³) − (3(0.3)² − 2(0.3)³) = 0.896 − 0.216 = 0.68
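With the closed-form cdf, interval probabilities reduce to one subtraction. A small Python sketch of the example above (`F` is an assumed helper name):

```python
# cdf from the example: F(x) = 3x**2 - 2x**3 on [0, 1].
def F(x):
    if x <= 0:
        return 0.0
    if x >= 1:
        return 1.0
    return 3 * x**2 - 2 * x**3

# P(0.3 <= x <= 0.8) = F(0.8) - F(0.3)
print(round(F(0.8) - F(0.3), 2))  # 0.68
```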

4.6. Common continuous probability distributions

1. Normal distribution (N-distribution)

It is the most important distribution for describing a continuous random variable and is used as an
approximation of other distributions. A random variable X is said to have a normal distribution if its
probability density function is given by

f(x) = (1/(σ√(2π))) e^(−(1/2)((x − μ)/σ)²), where x is the real value of X,

i.e. −∞ < x < ∞, −∞ < μ < ∞ and σ > 0,

where μ = E(X) and σ² = Var(X);

μ and σ² are the parameters of the normal distribution.

Properties of Normal Distribution:

1. It is bell shaped and is symmetrical about its mean. The maximum ordinate is at x = μ.

2. The curve approaches the horizontal x-axis as we go in either direction from the mean.

3. The total area under the curve sums to 1, that is, ∫[-∞ to ∞] f(x) dx = ∫[-∞ to ∞] (1/(σ√(2π))) e^(−(1/2)((x − μ)/σ)²) dx = 1.

4. The probability that the random variable will have a value between any two points is equal to the
area under the curve between those points.

5. The height of the normal curve attains its maximum at x = μ; this implies the mean and mode
coincide (are equal).

4.6.2 Standard normal distribution

It is a normal distribution with mean 0 and variance 1. A normal distribution can be converted to the
standard normal distribution as follows: if X has a normal distribution with mean μ and standard
deviation σ, then the standard normal deviate Z is given by

Z = (x − μ)/σ, with pdf

P(Z) = (1/√(2π)) e^(−z²/2)

Properties of the standard normal distribution:

 The same as normal distribution, but the mean is zero and the variance is one.
 Areas under the standard normal distribution curve have been tabulated in various ways. The
most common ones are the areas between Z = 0 and a positive value of Z.

Given a normally distributed random variable X with mean µ and standard deviation σ,

P(a < X < b) = P((a − µ)/σ < (X − µ)/σ < (b − µ)/σ)

But Z = (X − µ)/σ is the standard normal r.v., so

P(X ≤ a) = P(Z ≤ (a − µ)/σ)

Note: i) P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a ≤ X ≤ b)

ii) P(−∞ < Z < ∞) = 1

iii) P(a < Z < b) = P(Z < b) − P(Z < a), for a < b

Consider the situations under the standard normal curve. It is clear that

P(0 < Z < ∞) = 0.5 = P(−∞ < Z < 0)

i) If Z0 is a negative number, then P(Z < Z0) = P(Z < 0) − P(Z0 < Z < 0)

ii) If Z0 is a positive real number, then P(Z > Z0) = P(Z > 0) − P(0 < Z < Z0)

iii) If Z1 is a negative number and Z2 a positive real number, then P(Z1 < Z < Z2) = P(Z1 < Z < 0) + P(0 < Z < Z2)

iv) If Z1 and Z2 are positive real numbers with Z1 < Z2, then P(Z1 < Z < Z2) = P(0 < Z < Z2) − P(0 < Z < Z1)

(The sketches accompanying cases i-iv shade the corresponding regions under the standard normal curve.)

As the value of σ increases, the curve becomes more and more flat, and vice versa.

Examples: For a standard normal variable Z, find

a) P(−2.2 < Z < 1.2)

b) P(Z > 1.05)

c) P(0 < Z < 0.96)

d) P(−1.45 < Z < 0)

Solution: a) P(−2.2 < Z < 1.2) = P(0 < Z < 1.2) + P(−2.2 < Z < 0)

= P(0 < Z < 1.2) + P(0 < Z < 2.2)

= 0.3849 + 0.4861

= 0.8710

b) P(Z > 1.05) = 1 − P(Z < 1.05) = 1 − 0.8531 = 0.1469

c) P(0 < Z < 0.96) = 0.3315

d) P(−1.45 < Z < 0) = P(0 < Z < 1.45) = 0.4265

NOTE: By determining the z- value, we can find the area or the probability under any normal curve by
referring to the standard normal distribution table.
How to use the Normal distribution table to determine probabilities

a. If you wish to find the area between 0 and Z (or – Z), look up the value directly in the table.
Example: P (0 < Z < 0.96) = 0.3315
Example: P (-0.96 < Z < 0) = P (0 < Z < 0.96); because the curve is symmetric to z = 0
= 0.3315

b. To find area between two points on the different sides of the mean, add the corresponding areas found
in the N table.
Example: P (-2.2 < Z < 1.2) = P (-2.2 < Z < 0) + P (0 < Z < 1.2)

=P (0 < Z < 2.2) + P (0 < Z < 1.2)

=0.4861 + 0.3849

= 0.8710

c. To find the area between two points on the same side of the mean, determine the areas related to the
two values from the table, and then subtract the smaller area from the larger.
Example: P (0.96 < Z < 1.2) = P (0 < Z < 1.2) – P (0 < Z < 0.96)

= 0.3849 – 0.3315

= 0.0534

Example: P (-1.2 < Z < -0.96) = P (-1.2 < Z < 0) – P (-0.96 < Z < 0)

= P (0 < Z < 1.2) – P (0 < Z < 0.96); because the curve is


symmetric to z = 0

= 0.3849 – 0.3315

= 0.0534

d. To find the area beyond Z (or -Z) value towards the same direction, look the value of Z directly from
the table, and then subtract it from 0.5.
Example: P (Z > 1.05) = 0.5 – P (0 < Z < 1.05)

= 0.5 – 0.3531

= 0.1469

Example: P (Z < -1.05) =0.5 – P (-1.05 < Z < 0)

=0.5 – P (0< Z < 1.05); because the curve is symmetric to z = 0

=0.5– 0.3531

=0.1469

e. To find area beyond Z (or –Z) value towards the different direction, look the value of Z directly from
the table, and then add the probability with 0.5.
Example: P (Z > -1.05) = P (-1.05 < Z < 0) + P (0 < Z < ∞)

= P (0 < Z < 1.05) + 0.5

= 0.3531 + 0.5

= 0.8531

Example: P (Z < 1.05) = P (−∞ < Z < 0) + P (0 < Z < 1.05)

= 0.5 + 0.3531

= 0.8531

Example: The average satellite transmission is 150 seconds, with a standard deviation of 15 seconds.
Time appears to be normally distributed. What is the probability that a call will last

a. between 125 and 150 seconds

b. between 145 and 155 seconds

c. more than 175 seconds

d. less than 160 seconds

e. less than 125 seconds

f. between 160 and 165 seconds

g. between 135 and 140 seconds

h. more than 140 seconds
So/n: Given μ = 150, σ = 15, and let x = time.

a) P(125 < x < 150) = P((125 − 150)/15 < (x − μ)/σ < (150 − 150)/15)

= P(−1.67 < Z < 0) = P(0 < Z < 1.67) = 0.4525

Note: The rest are left as exercises.
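Instead of the printed table, the standard normal cdf can also be computed from the error function. A Python sketch checking part (a) (`phi` is an assumed helper name, not from the text):

```python
from math import erf, sqrt

def phi(z):
    # Standard normal cdf P(Z <= z), via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 150, 15

# a. P(125 < x < 150) = P((125 - 150)/15 < Z < 0)
prob = phi((150 - mu) / sigma) - phi((125 - mu) / sigma)
print(round(prob, 4))  # ~0.452; the table, with z rounded to -1.67, gives 0.4525
```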


The t- distribution (student’s t distribution)

Suppose we have a sample X1, …, Xn from a normal population that has mean μ (unknown) and standard
deviation σ (unknown), and using this sample data we want to develop an interval estimator of the
population mean μ. Then X̄ ~ N(μ, σ²/n), and

Z = (X̄ − μ)/(σ/√n)

has a standard normal distribution. But σ is unknown, so we substitute its point estimator S. Hence,

t(n−1) = (X̄ − μ)/(S/√n)

is said to follow a t-distribution with n − 1 df; this is used when n is not sufficiently large (n ≤ 30).

N.B. df = degrees of freedom: the number of independent observations in a set of observations.

Note: tα(v) stands for the value of t with v df to the right of which an area equal to α lies.

Example: t0.025(12) = 2.179 means P(t(12) ≥ 2.179) = 0.025

t0.01(25) = 2.485 means P(t(25) ≥ 2.485) = 0.01

Exercises

1. Suppose 20% of the population is victims of crime. In a family of 5, what is the probability that none of
the family is a crime victim?

2. Consider a random variable X that takes a value either 1 or 0 with respective probabilities P and 1-P.
find the expected value as well as the variance of the r.v.

3. The probability that a student entering a college will graduate is 0.4. Determine the probability that,
out of 5 students, (a) none, (b) one, (c) at least one, and (d) at most three will graduate.

4. Find the area under the standard normal curve bounded by:

________________ (a) Z= -1.3 and Z= 1.39

________________ (b) Z= 0.73 and Z= 1.36

________________ (c) Z=-2.43 and Z=-1.56

5. A production engineer finds that, on average, mechanics working in a machine shop complete a
certain task in 15 minutes. The times required to complete the task are approximately normally distributed
with a standard deviation of 3 minutes. Find the probabilities that the task is completed:

________________ (a) in less than 8 minutes,

________________(b) in more than 9 minutes, and

________________(c) between 10 and 12 minutes

TABLE OF CUMULATIVE AREA OF THE STANDARD NORMAL CURVE from 0 to z

Φ(z) = (1/√(2π)) ∫[0 to z] exp(−z²/2) dz
z .00 .01 .02 .03 .04 .05 .06 .07 .08 .09


0.0 .0000 .0040 .0080 .0120 .0160 .0199 .0239 .0279 .0319 .0359
0.1 .0398 .0438 .0478 .0517 .0557 .0596 .0636 .0675 .0714 .0754
0.2 .0793 .0832 .0871 .0910 .0948 .0987 .1026 .1064 .1103 .1141
0.3 .1179 .1217 .1255 .1293 .1331 .1368 .1406 .1443 .1480 .1517
0.4 .1554 .1591 .1628 .1664 .1700 .1736 .1772 .1808 .1844 .1879
0.5 .1915 .1950 .1985 .2019 .2054 .2088 .2123 .2157 .2190 .2224
0.6 .2257 .2291 .2324 .2357 .2389 .2422 .2454 .2486 .2517 .2549
0.7 .2580 .2611 .2642 .2673 .2704 .2734 .2764 .2794 .2823 .2852
0.8 .2881 .2910 .2939 .2967 .2995 .3023 .3051 .3078 .3106 .3133
0.9 .3159 .3186 .3212 .3238 .3264 .3289 .3315 .3340 .3365 .3389
1.0 .3413 .3438 .3461 .3485 .3508 .3531 .3554 .3577 .3599 .3621
1.1 .3643 .3665 .3686 .3708 .3729 .3749 .3770 .3790 .3810 .3830
1.2 .3849 .3869 .3888 .3907 .3925 .3944 .3962 .3980 .3997 .4015
1.3 .4032 .4049 .4066 .4082 .4099 .4115 .4131 .4147 .4162 .4177
1.4 .4192 .4207 .4222 .4236 .4251 .4265 .4279 .4292 .4306 .4319
1.5 .4332 .4345 .4357 .4370 .4382 .4394 .4406 .4418 .4429 .4441
1.6 .4452 .4463 .4474 .4484 .4495 .4505 .4515 .4525 .4535 .4545
1.7 .4554 .4564 .4573 .4582 .4591 .4599 .4608 .4616 .4625 .4633
1.8 .4641 .4649 .4656 .4664 .4671 .4678 .4686 .4693 .4699 .4706
1.9 .4713 .4719 .4726 .4732 .4738 .4744 .4750 .4756 .4761 .4767
2.0 .4772 .4778 .4783 .4788 .4793 .4798 .4803 .4808 .4812 .4817
2.1 .4821 .4826 .4830 .4834 .4838 .4842 .4846 .4850 .4854 .4857
2.2 .4861 .4864 .4868 .4871 .4875 .4878 .4881 .4884 .4887 .4890
2.3 .4893 .4896 .4898 .4901 .4904 .4906 .4909 .4911 .4913 .4916
2.4 .4918 .4920 .4922 .4925 .4927 .4929 .4931 .4932 .4934 .4936
2.5 .4938 .4940 .4941 .4943 .4945 .4946 .4948 .4949 .4951 .4952
2.6 .4953 .4955 .4956 .4957 .4959 .4960 .4961 .4962 .4963 .4964
2.7 .4965 .4966 .4967 .4968 .4969 .4970 .4971 .4972 .4973 .4974
2.8 .4974 .4975 .4976 .4977 .4977 .4978 .4979 .4979 .4980 .4981
2.9 .4981 .4982 .4982 .4983 .4984 .4984 .4985 .4985 .4986 .4986
3.0 .4987 .4987 .4987 .4988 .4988 .4989 .4989 .4989 .4990 .4990
3.1 .4990 .4991 .4991 .4991 .4992 .4992 .4992 .4992 .4993 .4993
3.2 .4993 .4993 .4994 .4994 .4994 .4994 .4994 .4995 .4995 .4995
3.3 .4995 .4995 .4995 .4996 .4996 .4996 .4996 .4996 .4996 .4997
3.4 .4997 .4997 .4997 .4997 .4997 .4997 .4997 .4997 .4997 .4998
3.5 .4998 .4998 .4998 .4998 .4998 .4998 .4998 .4998 .4998 .4998
3.6 .4998 .4998 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999
3.7 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999
3.8 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999 .4999
3.9 .5000 .5000 .5000 .5000 .5000 .5000 .5000 .5000 .5000 .5000

CRITICAL VALUES OF THE t DISTRIBUTION, tα(df)

df α = 0.10 α = 0.05 α = 0.025 α = 0.01 α = 0.005 df

1 3.078 6.314 12.706 31.821 63.657 1

2 1.886 2.920 4.303 6.965 9.925 2

3 1.638 2.353 3.182 4.541 5.841 3

4 1.533 2.132 2.776 3.747 4.604 4

5 1.476 2.015 2.571 3.365 4.032 5

6 1.440 1.943 2.447 3.143 3.707 6

7 1.415 1.895 2.364 2.998 3.499 7

8 1.397 1.860 2.306 2.896 3.355 8

9 1.383 1.833 2.262 2.821 3.245 9

10 1.372 1.812 2.228 2.764 3.169 10

11 1.363 1.796 2.201 2.718 3.106 11

12 1.356 1.783 2.179 2.681 3.054 12

13 1.350 1.771 2.160 2.650 3.012 13

14 1.345 1.761 2.145 2.624 2.977 14

15 1.341 1.753 2.131 2.603 2.947 15

16 1.337 1.746 2.120 2.583 2.921 16

17 1.333 1.740 2.110 2.567 2.898 17

18 1.330 1.734 2.101 2.552 2.878 18

19 1.328 1.729 2.093 2.539 2.861 19

20 1.325 1.725 2.086 2.528 2.845 20

21 1.323 1.721 2.080 2.518 2.831 21

22 1.321 1.717 2.074 2.508 2.819 22

23 1.319 1.714 2.069 2.500 2.807 23

24 1.318 1.711 2.064 2.492 2.797 24

25 1.316 1.708 2.060 2.485 2.787 25

26 1.315 1.706 2.056 2.479 2.779 26

27 1.314 1.703 2.052 2.473 2.771 27

28 1.313 1.701 2.048 2.467 2.763 28

29 1.311 1.699 2.045 2.462 2.756 29

∞ 1.282 1.645 1.960 2.326 2.576 ∞

