Stat 112 Lecture Note

The document outlines a course on probability, covering fundamental concepts such as probability definitions, counting techniques, random variables, and probability distributions. It explains key terms and principles, including independent and dependent events, mutually exclusive events, and various probability definitions and properties. Additionally, it provides examples and applications of probability laws, including conditional probability, the law of total probability, and Bayes' theorem.

STA 112 (PROBABILITY)

Course Outline

➢ Introduction to Probability
➢ Counting Techniques
➢ Expectation of a Random Variable
➢ Probability Distributions

INTRODUCTION TO PROBABILITY

One of the fundamental tools of statistics is probability. It is a numerical value that measures the
degree of uncertainty surrounding the occurrence of some event. Probability theory deals with
uncertainty. The probability of an event is the likelihood of that event occurring or not occurring;
it is a measure of the chance that a certain event will occur. Simply put, probability is measured by the
ratio of the number of outcomes in the event to the total number of possible outcomes. It is given as

P(E) = n(E) / n(S)

where n(E) is the number of outcomes in the event E and n(S) is the total number of outcomes in the sample space S.

Definition of Terms

a. Experiment: a mechanism by which we generate an observation or a measurement. The
term experiment refers both to controlled laboratory measurements produced by a
certain process and to uncontrolled situations such as measuring the life length of electric
bulbs or measuring daily rainfall.
b. Random Experiment: an experiment that can result in different outcomes, even though
it is repeated in the same manner every time.
c. Simple event: a set consisting of a single possible outcome of an experiment. Example:
in a throw of a die, a simple event may be {1} or {2} or {3} or {4} or {5} or {6}.
d. Event Space: a subset of all outcomes of the experiment. It is a union of some simple
events. Example: an event may be the occurrence of an even number in the throw of a die.
e. Sample space or universal set: a set containing all possible outcomes of a statistical
experiment. It is the collection or union of all simple events of an experiment and is usually
represented by S. Example: in a throw of a die, the sample space is S = {1,2,3,4,5,6}.
f. Independent Events: two events are said to be independent if one has no effect on the
probability of the other. P(AB) = P(A ∩ B) = P(A) . P(B)
g. Dependent Events: two events A and B are said to be dependent if the outcome of B
depends on the outcome of A. P(A ∩ B) = P(A) . P(B/A)
h. Mutually Exclusive: two events are said to be mutually exclusive if they cannot occur
simultaneously, i.e. at the same time.
i. Equally Likely: outcomes are equally likely if no outcome is any more likely to occur than any
other. Example: in the die experiment, to give all outcomes an equal chance of occurring, the
die is placed in a cup and shaken before each throw.

Various Definitions of Probability

Generally, the probability of an event can be interpreted as our degree of belief that the event will
occur. However, there are three basic approaches to the definition of probability

i. Classical or Prior Theory of Probability: according to the classical school, the probability
of success is based upon prior knowledge of the process. It is defined as the ratio of the number
of favourable outcomes to the total number of equally likely possible outcomes.
Shortcomings:
a) This approach cannot apply in situations in which the outcomes cannot all be
regarded as equally likely.
b) The approach cannot apply when the number of possible outcomes is infinite.
ii. Relative Frequency or Empirical or Posteriori Probability: the probability of success
is based upon observed data, that is, on what has actually happened. It states
that probability is determined by observing the relative frequency of an event's occurrence
in a long series of trials or experiments.
Shortcoming:
a) This approach does not specify how large the number of observations must be
before it can be considered large enough.
iii. Probability Axioms:
a. Axiom 1: the probability of an event is a non-negative real number, i.e.
P(E) ≥ 0 for any subset E of S.
b. Axiom 2: P(S) = 1.
c. Axiom 3: if E1, E2, … is a finite or infinite sequence of mutually exclusive events of
S, then P(E1 ∪ E2 ∪ E3 ∪ …) = P(E1) + P(E2) + P(E3) + …

Properties (Laws) of Probability


1. The probability of any event can never be negative i.e. P(event) ≥ 0
2. The probability of any event can never be greater than one i.e. P(event) ≤ 1
3. The probability of certainty or sure event is one
4. The probability of an impossible event is zero
5. The sum of the probability of all possible outcomes is one i.e. ∑P(event) = 1
6. The probability that an event E will not occur is P(Ē) = 1 – P(E)

Tree Diagram: This is a means by which all elements of the sample space are symbolically listed like a
tree. It shows all possible outcomes of an experiment. Example: If a coin is tossed twice, find the
probability of getting a head.

          H    HH
     H
          T    HT

          H    TH
     T
          T    TT

S = {HH, HT, TH, TT}; note that if the coin is fair, P(H) = P(T) = 1/2

Such that P(HH) = P(H) x P(H) = 1/2 x 1/2 = 1/4

P(TH) = P(T) x P(H) = 1/2 x 1/2 = 1/4

P(exactly one head) = P(HT or TH) = P(HT) + P(TH)

= 1/4 + 1/4 = 2/4 = 1/2

P(at least one head) = P(HT or TH or HH) = P(HT) + P(TH) + P(HH)

= 1/4 + 1/4 + 1/4 = 3/4
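These tree-diagram results can be checked by enumerating the sample space directly; the short Python sketch below lists all four equally likely outcomes and sums the relevant probabilities exactly.

```python
# Enumerate the sample space of two tosses of a fair coin and verify the
# probabilities computed above with exact fractions.
from itertools import product
from fractions import Fraction

space = list(product("HT", repeat=2))   # [('H','H'), ('H','T'), ('T','H'), ('T','T')]
p = Fraction(1, len(space))             # each equally likely outcome has probability 1/4

p_one_head = sum(p for s in space if s.count("H") == 1)
p_at_least_one = sum(p for s in space if s.count("H") >= 1)

print(p_one_head)       # 1/2
print(p_at_least_one)   # 3/4
```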
Example 1: A bag contains 5 white balls and 4 black balls. Two balls are picked one after the other
with replacement. What is the probability that the two balls picked are white?

Solution:

White balls = 5, P(white) = 5/9

Black balls = 4, P(black) = 4/9

Total = 9, P(2 whites) = 5/9 x 5/9 = 25/81

Additive Rules

If A and B are two mutually exclusive events, the probability of A or B is given as

P(A or B) = P(A ∪ B) = P(A) + P(B)

More generally, for any two events A and B,

P(A ∪ B) = P(A) + P(B) – P(A ∩ B)

Example 2: In a toss of a fair die, let event A be obtaining a prime number and event B an odd
number. Find the probability of event A or B.

Solution

A = {2,3,5} , B = {1,3,5} , A ∩ B = {3,5}

P(A) = 3/6 = 1/2 , P(B) = 3/6 = 1/2 , P(A ∩ B) = 2/6 = 1/3

P(A ∪ B) = P(A) + P(B) – P(A ∩ B)

= 1/2 + 1/2 – 1/3

= 2/3

Example 3: A coin is tossed four times, find the probability of getting


a) Two heads and two tails
b) At least two heads
c) At most two heads
d) No tail
e) Head followed by a tail

Solution

                    Last two tosses
First two tosses    HH      HT      TH      TT
HH                  HHHH    HHHT    HHTH    HHTT
HT                  HTHH    HTHT    HTTH    HTTT
TH                  THHH    THHT    THTH    THTT
TT                  TTHH    TTHT    TTTH    TTTT

i) P(Two heads and two tails)
A = {HHTT, HTHT, HTTH, THHT, THTH, TTHH}
P(A) = 6/16
ii) P(At least two heads)
A = {HHHH, HHHT, HHTH, HHTT, HTHH, HTHT, HTTH, THHH, THHT, THTH, TTHH}
P(A) = 11/16
iii) P(At most two heads)
A = {HHTT, HTHT, HTTH, HTTT, THHT, THTH, THTT, TTHH, TTHT, TTTH, TTTT}
P(A) = 11/16
iv) P(No tail)
A = {HHHH}
P(A) = 1/16
v) P(A head on the first toss followed by a tail on the second)
A = {HTHH, HTHT, HTTH, HTTT}
P(A) = 4/16 = 1/4
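The answers for the four tosses can be verified the same way, by enumerating all 16 equally likely outcomes and counting heads:

```python
# Enumerate all 16 outcomes of four tosses of a fair coin and count heads.
from itertools import product
from fractions import Fraction

space = list(product("HT", repeat=4))
p = Fraction(1, len(space))                            # 1/16 per outcome

def heads(s):
    return s.count("H")

p_two_two = sum(p for s in space if heads(s) == 2)     # two heads and two tails
p_at_least2 = sum(p for s in space if heads(s) >= 2)   # at least two heads
p_at_most2 = sum(p for s in space if heads(s) <= 2)    # at most two heads
p_no_tail = sum(p for s in space if heads(s) == 4)     # no tail

print(p_two_two, p_at_least2, p_at_most2, p_no_tail)   # 3/8 11/16 11/16 1/16
```

Note that Fraction reduces 6/16 to 3/8 automatically.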

Multiplicative Rules

If A and B are two independent events, the probability of A and B is given as

P (A and B) = P(𝐴 ∩ 𝐵) = P(A) . P(B)

Example 4: Three balls are drawn without replacement from a bag containing 8 blue, 12 red and
10 yellow balls. Find the probability that

a) The three balls are of the same colour
b) Two balls are red and one is blue
c) The three balls are of different colours

Solution

Blue balls = 8, Red balls = 12, Yellow balls = 10, Total balls = 30

a) P(BBB or RRR or YYY) = P(B ∩ B ∩ B) + P(R ∩ R ∩ R) + P(Y ∩ Y ∩ Y)

= (8/30 x 7/29 x 6/28) + (12/30 x 11/29 x 10/28) + (10/30 x 9/29 x 8/28)

= 2/145 + 11/203 + 6/203

= 99/1015

b) P(BRR or RRB or RBR) = P(B ∩ R ∩ R) + P(R ∩ R ∩ B) + P(R ∩ B ∩ R)

= (8/30 x 12/29 x 11/28) + (12/30 x 11/29 x 8/28) + (12/30 x 8/29 x 11/28)

= 44/1015 + 44/1015 + 44/1015

= 132/1015

c) A = {BRY, BYR, RBY, RYB, YBR, YRB}. Each of the six orderings has the same probability, so

P(A) = 6 x (8/30 x 12/29 x 10/28)

= 6 x 960/24360

= 48/203
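Parts (a) and (b) can be confirmed by brute-force enumeration of every ordered draw without replacement; a minimal Python sketch:

```python
# Enumerate every ordered draw of 3 balls (without replacement) from the bag
# of 8 blue, 12 red and 10 yellow balls, and verify parts (a) and (b) exactly.
from itertools import permutations
from fractions import Fraction

bag = ["B"] * 8 + ["R"] * 12 + ["Y"] * 10
draws = list(permutations(range(30), 3))   # 30 x 29 x 28 = 24360 ordered draws
p = Fraction(1, len(draws))

def colours(d):
    return [bag[i] for i in d]

p_same = sum(p for d in draws if len(set(colours(d))) == 1)
p_two_red_one_blue = sum(p for d in draws if sorted(colours(d)) == ["B", "R", "R"])

print(p_same)               # 99/1015
print(p_two_red_one_blue)   # 132/1015
```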

Conditional Probability

Let A and B be any two events in a sample space S. The probability that A occurs given that B
has occurred, called the conditional probability of A given B and denoted by P(A/B), is defined as

P(A/B) = P(A ∩ B) / P(B)

while the conditional probability of B given A is given by

P(B/A) = P(A ∩ B) / P(A)

Example: Find the probability that a single toss of a die results in a number less than 4, if it is
given that the toss results in an odd number.

Solution

Let A be the event "number less than 4" = {1,2,3} and B the event "odd number" = {1,3,5}.

P(A) = P(1) + P(2) + P(3) = 1/6 + 1/6 + 1/6 = 1/2; P(B) = P(1) + P(3) + P(5) = 1/6 + 1/6 + 1/6 = 1/2

Also, P(A ∩ B) = P(1) + P(3) = 1/6 + 1/6 = 1/3

P(A/B) = P(A ∩ B) / P(B) = (1/3) / (1/2) = 1/3 x 2/1 = 2/3
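The same answer can be reached by counting outcomes of the die directly and applying the formula P(A/B) = P(A ∩ B) / P(B):

```python
# Conditional probability on the die example: A = "less than 4", B = "odd".
from fractions import Fraction

S = set(range(1, 7))
A = {x for x in S if x < 4}        # {1, 2, 3}
B = {x for x in S if x % 2 == 1}   # {1, 3, 5}

def P(E):
    return Fraction(len(E), len(S))

p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)   # 2/3
```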

The Law of total Probability

Let A1, A2, …, An be mutually exclusive and exhaustive events. Then for any other event B,

P(B) = ∑ 𝑃(𝐴𝑖 ∩ 𝐵) = 𝑃(𝐴1 ∩ 𝐵) + 𝑃(𝐴2 ∩ 𝐵) + … + 𝑃(𝐴𝑛 ∩ 𝐵)

= ∑ 𝑃(𝐴𝑖 )𝑃(𝐵/𝐴𝑖 )

Example: A person has undertaken a mining job. The probabilities of completion of the job on
time with and without rain are 0.42 and 0.90 respectively. If the probability that it will rain is
0.45, then determine the probability that the mining job will be completed on time.

Solution

Let A be the event that the mining job will be completed on time and B be the event that it rains.
We have,

P(B) = 0.45,

P(no rain) = P(B′) = 1 − P(B) = 1 − 0.45 = 0.55

The conditional probabilities of completing the job are

P(A/B) = 0.42, P(A/B′) = 0.90

Since events B and B′ form a partition of the sample space S, by the total probability theorem we have
P(A) = P(B) P(A/B) + P(B′) P(A/B′)

=0.45 × 0.42 + 0.55 × 0.9 = 0.189 + 0.495 = 0.684

So, the probability that the job will be completed on time is 0.684.

Bayes Theorem

Let A1, A2, …, An be a collection of n mutually exclusive and exhaustive events with P(Ai) > 0
for i = 1,…,n. Then for any other event B with P(B) > 0,

P(Ak/B) = P(Ak ∩ B) / P(B) = P(Ak) . P(B/Ak) / ∑ P(Ai) . P(B/Ai)

Proof:

P(Ak ∩ B) = P(B ∩ Ak)

Applying the multiplication rule to each side gives

P(Ak) P(B/Ak) = P(B) P(Ak/B)

P(Ak/B) = P(Ak) P(B/Ak) / P(B)

According to the Law of Total Probability,

P(B) = ∑ P(Ai) P(B/Ai)

Therefore, P(Ak/B) = P(Ak) . P(B/Ak) / ∑ P(Ai) . P(B/Ai)
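Bayes' theorem can be illustrated by reversing the conditioning in the mining example above: given that the job was completed on time, the probability that it rained is P(B)P(A/B)/P(A) = 0.189/0.684 ≈ 0.276. A small sketch:

```python
# Bayes' theorem on the mining example: find P(rain | completed on time).
priors = {"rain": 0.45, "no rain": 0.55}        # P(weather)
likelihood = {"rain": 0.42, "no rain": 0.90}    # P(on time | weather)

# denominator from the law of total probability: 0.684
p_on_time = sum(priors[w] * likelihood[w] for w in priors)
posterior = {w: priors[w] * likelihood[w] / p_on_time for w in priors}

print(round(posterior["rain"], 3))   # 0.276
```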
COUNTING TECHNIQUES

There are two counting techniques that can be used when listing all outcomes directly would be difficult.

a. Permutation
b. Combination

Factorial Notation: The factorial of a number n, denoted by n! is defined as the product of all
positive integers, from 1 to n inclusive. That is n! = n(n – 1)(n – 2)…1

Permutation

A permutation is an ordered selection of r out of n objects without replacement. It can also
be defined as an arrangement of all or part of a set of objects in a particular order
without replacement. The number of permutations of n distinct objects taken r at a time is given as

nPr = n! / (n − r)! ; n! = n factorial = n(n − 1)(n − 2)…3.2.1

Example 1: An examination paper has six questions but only four are to be answered. In how
many different ways can the answers be arranged.

Solution

n = 6, r = 4, nPr = 6P4 = 6! / (6 − 4)! = 360 ways

Example 2: How many ways can 7 people be assigned to 1 triple and 2 double canoes?

Solution

n = 7, n1 = 3, n2 = 2, n3 = 2

The total number of possible partitions is 7! / (3!2!2!) = 210 ways
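Both counts can be reproduced with the standard library (math.perm is available from Python 3.8):

```python
import math

# Example 1: ordered arrangements of 4 answered questions out of 6
ways_answers = math.perm(6, 4)       # 6!/(6-4)!
print(ways_answers)                  # 360

# Example 2: partitions of 7 people into groups of sizes 3, 2 and 2
ways_partition = math.factorial(7) // (math.factorial(3) * math.factorial(2) ** 2)
print(ways_partition)                # 210
```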

Example 3: How many distinct permutations can be made from the letters of the word
"STATISTICS"?

Solution

The word has 10 letters: S = 3, T = 3, I = 2, A = 1, C = 1

10! / (3!3!2!1!1!) = 50,400 ways

Circular Permutations

Permutation that occurs when objects are arranged in a circle are called circular permutations. The
number of arrangements of n distinct objects in a circle is (n – 1)!

Example 4: In how many ways can 3 men and 2 women be seated around a conference table if
each person can sit anywhere?

Solution

Since there are 5 people to be seated round the table, any one person can be seated at a starting point,
leaving us with 4 people to arrange. Therefore, the number of ways = (5 − 1)! = 4! = 4.3.2.1 = 24 ways.

Permutation of Indistinguishable elements

The number of permutations of n objects of which n1 are of one kind, n2 of a second kind, …, nk
of a kth kind, where n1 + n2 + … + nk = n, is n! / (n1! n2! … nk!)

Example 5: In how many ways can 3 oranges, 2 mangoes and 2 apples be arranged in a straight
line if one does not distinguish between fruits of the same kind?

Solution

The total number of distinct arrangements is 7! / (3!2!2!) = 210

COMBINATION

A combination is a selection of r out of n objects without attention given to the order of
arrangement. The number of combinations of n distinct objects taken r at a time is given as

nCr = (n choose r) = n! / (r!(n − r)!)
Example 1: From four republicans and three democrats, find the number of committee(s) that can
be formed with two republicans and one democrat.

Solution

The number of ways of selecting 2 republicans from 4 is

4C2 = 4! / (2!(4 − 2)!) = 6

The number of ways of selecting 1 democrat from 3 is

3C1 = 3! / (1!(3 − 1)!) = 3

The total number is given as 6 x 3 = 18 ways

Example 2: A committee of 7 people is to be chosen from 5 women and 8 men. The committee is
to be made up of 5 men and 2 women. How many possible ways can it be selected?

Solution

8C5 = 8! / (5!(8 − 5)!) = 8! / (5!3!) = 56

5C2 = 5! / (2!(5 − 2)!) = 5! / (2!3!) = 10

Total = 56 x 10 = 560 ways
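Both committee counts follow directly from math.comb:

```python
import math

# Example 1: 2 republicans from 4 and 1 democrat from 3
committees_1 = math.comb(4, 2) * math.comb(3, 1)
print(committees_1)   # 18

# Example 2: 5 men from 8 and 2 women from 5
committees_2 = math.comb(8, 5) * math.comb(5, 2)
print(committees_2)   # 560
```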
EXPECTATION OF A RANDOM VARIABLE
Let x be a finite random variable with the following distribution
x x1, x2, x3, …, xn
f(x) f(x1), f(x2), f(x3), … , f(xn)
Then the mean or expectation or expected value of x denoted by E(x) is defined by
E(x) = x1f(x1) + x2f(x2) + x3f(x3) + … + xnf(xn)
E(x) = ∑ 𝑥𝑖 f(𝑥𝑖 )
Expectation Theorems
If X is a discrete random variable with probability mass function f(x), the expected value of the
random variable g(X) is given by

E[g(X)] = ∑x g(x)f(x)

Similarly, if X is a continuous random variable with probability density f(x), the expected value of
the random variable g(X) is given by

E[g(X)] = ∫ g(x)f(x) dx, the integral taken from −∞ to ∞

Example: Let x denote the number of times head occurs when a fair coin is tossed 3 times.
Calculate the expectation of x.
Solution

             2nd & 3rd Throws
1st Throw    HH     HT     TH     TT
H            HHH    HHT    HTH    HTT
T            THH    THT    TTH    TTT

Probability Distribution of x

x      0     1     2     3
f(x)   1/8   3/8   3/8   1/8

E(x) = ∑ xi f(xi)

E(x) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8)

E(x) = 0 + 3/8 + 6/8 + 3/8 = 12/8; E(x) = 3/2
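The expectation can be computed directly from the distribution table:

```python
# E(x) = sum of x * f(x) for the number of heads in three tosses of a fair coin.
from fractions import Fraction

f = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
E = sum(x * px for x, px in f.items())
print(E)   # 3/2
```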

Variance and Standard Deviation

The mean of a random variable x measures, in a certain sense, the average value of x. The standard
deviation, on its part, measures the spread or dispersion. Let x be a random variable with
mean 𝜇 = E(x) and the following probability distribution

x      x1, x2, x3, …, xn
f(x)   f(x1), f(x2), f(x3), … , f(xn)

The variance of x, denoted by var(x), is defined by

Var(x) = (x1 − 𝜇)²f(x1) + (x2 − 𝜇)²f(x2) + (x3 − 𝜇)²f(x3) + ⋯ + (xn − 𝜇)²f(xn)

= ∑(xi − 𝜇)²f(xi) = E[(x − 𝜇)²]

The standard deviation of x, denoted by 𝜎x or simply 𝜎, is the non-negative square root of
var(x), i.e.

𝜎x = √var(x)

Example

If x denotes the sum obtained from an experiment of tossing a pair of dice, calculate the mean,
variance and standard deviation.

Solution

1 2 3 4 5 6
1 (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
2 (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
S = 3 (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
4 (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
5 (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
6 (6,1) (6,2) (6,3) (6,4) (6,5) (6,6)
X      2     3     4     5     6     7     8     9     10    11    12
f(x)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

Mean = 𝜇 = E(x) = ∑ xi f(xi)

= 2(1/36) + 3(2/36) + 4(3/36) + ⋯ + 12(1/36)

E(x) = 252/36 = 7

Variance

X      f(x)   xf(x)   xi − 𝜇   (xi − 𝜇)²   (xi − 𝜇)²f(xi)
2      1/36   2/36    -5       25          25/36
3      2/36   6/36    -4       16          32/36
4      3/36   12/36   -3       9           27/36
5      4/36   20/36   -2       4           16/36
6      5/36   30/36   -1       1           5/36
7      6/36   42/36   0        0           0
8      5/36   40/36   1        1           5/36
9      4/36   36/36   2        4           16/36
10     3/36   30/36   3        9           27/36
11     2/36   22/36   4        16          32/36
12     1/36   12/36   5        25          25/36
Total                                      210/36

Var(x) = ∑(xi − 𝜇)²f(xi)

= 210/36 = 5.83

SD = 𝜎 = √var(x)

= √5.83 = 2.42
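The mean and variance of the dice sum can be recomputed exactly by enumerating all 36 outcomes:

```python
# Sum of two dice: enumerate the 36 equally likely outcomes, then compute
# the mean and variance exactly.
from itertools import product
from fractions import Fraction

sums = [a + b for a, b in product(range(1, 7), repeat=2)]
p = Fraction(1, 36)

mean = sum(s * p for s in sums)
var = sum((s - mean) ** 2 * p for s in sums)

print(mean)                # 7
print(var)                 # 35/6, about 5.83
print(float(var) ** 0.5)   # standard deviation, about 2.42
```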
PROBABILITY DISTRIBUTION

Random Variable

A random variable is a numerical quantity whose value is determined by an experiment;
in other words, it is a value determined by chance. It can also be defined as a characteristic that may
take on different values randomly from one observation to another, from a prescribed set of
values called the domain of the variable. There are two types of random variable, namely:

1. Qualitative Random Variables: yield categorical or non-numerical data. Example: blood
group, hair colour etc.

2. Quantitative Random Variables: yield numerical data and are further subdivided into discrete
random variables and continuous random variables.

i) Discrete Random Variable: assumes only a finite or countable number of distinct
values. Example: the sum of the numbers showing on two dice (which can assume one of
the values 2, 3, 4, …, 12) or the number of children in a family.
ii) Continuous Random Variable: assumes an infinite number of values. It is a
numerical response which arises from a measuring process. Example: the length of time
elapsing between the arrivals of successive voters at a polling booth, age, weight etc.

Probability Distribution Functions

A function is a rule which assigns to every element x of a set X a unique element y in
another set Y.

If we represent this rule by f, then f maps X into Y, i.e.

f: X → Y, and for x ∈ X, f(x) = y ∈ Y ∀ x ∈ X (f(x) is equal to y for all x in X).

f(x) represents the image of x; thus, for some y ∈ Y, y = f(x). The element y is the image of
x, and y is called the dependent variable, while x, called the independent variable, ranges
over the domain of f.

Domain: the domain of a function f is the set of all values of x for which f is defined or
to which f assigns an image. Thus, the set X is called the domain.
Co-domain (Image set): the co-domain of f is the set Y of all possible images of X.

Range: the set of all images of elements of X is called the range of f. It is always a subset
of the co-domain.

Probability Distribution Function: This is defined as any rule or mathematical


expression which assigns probabilities to each of the possible values of a random variable.
It is divided into discrete (for discrete random variables) and continuous probability
distribution (for continuous random variables).

Probability Mass Function: It is a mathematical expression or function f(x) representing


a discrete probability distribution.

Probability Density Function: It is a mathematical expression or function f(x)


representing a continuous probability distribution.

Discrete Probability Distribution: a mathematical expression or function f(x)
representing a discrete probability distribution is called a probability mass function, with
cumulative distribution F(x) = ∑ f(t), the sum taken over all t ≤ x.

Conditions for a Discrete PDF

i) f(xi) > 0, for i = 1,2,3,…
ii) f(x) = 0, for x ≠ xi ; i = 1,2,3,…
iii) ∑ f(x) = 1

Continuous Probability Distribution Function

P(a ≤ x ≤ b) = ∫ f(x) dx, the integral taken from a to b

Conditions for a Continuous PDF

i) f(x) ≥ 0
ii) ∫ f(x) dx over the whole range of x = 1
iii) ∫ f(x) dx from a to b = P(a ≤ x ≤ b)
Example: A random variable x has a pdf of the form f(x) = K(8 – x), for x = 0,1,2,3,4,5

i) Find the value of the constant K
ii) Show that f(x) is a pdf

Solution:

i) ∑ f(x) = K(8 + 7 + 6 + 5 + 4 + 3) = 33K = 1, so K = 1/33

X      0     1     2     3     4     5
f(x)   8/33  7/33  6/33  5/33  4/33  3/33

ii) Each f(x) > 0 and ∑ f(x) = (8 + 7 + 6 + 5 + 4 + 3)/33 = 33/33 = 1, so f(x) is a pdf.

Mean and Variance of the Poisson Distribution

Let x be a random variable with a Poisson distribution P(x, 𝜆).

E(x) = ∑ x . P(x, 𝜆), the sum taken over x = 0, 1, 2, …

= ∑ x . 𝜆^x e^(−𝜆) / x!

We drop the term x = 0 and factor out 𝜆 from each term:

E(x) = 𝜆 ∑ 𝜆^(x−1) e^(−𝜆) / (x − 1)!, the sum now over x = 1, 2, …

Let s = x − 1; as x runs through the values 1, 2, …, s runs through the values 0, 1, 2, …

E(x) = 𝜆 ∑ 𝜆^s e^(−𝜆) / s!

where ∑ 𝜆^s e^(−𝜆) / s! = ∑ P(s, 𝜆) = 1

Therefore, E(x) = 𝜆 . 1 = 𝜆

The variance of a Poisson distribution is also 𝜆:

Var(x) = E(x²) – [E(x)]²

E(x²) = ∑ x² . P(x, 𝜆) = ∑ x² . 𝜆^x e^(−𝜆) / x!

= 𝜆 ∑ x . 𝜆^(x−1) e^(−𝜆) / (x − 1)!

Let s = x − 1; then

E(x²) = 𝜆 ∑ (s + 1) 𝜆^s e^(−𝜆) / s!

= 𝜆 ∑ (s + 1) P(s, 𝜆)

Breaking up the sum above into 2 sums, we have

E(x²) = 𝜆 ∑ s P(s, 𝜆) + 𝜆 ∑ P(s, 𝜆)

= 𝜆(𝜆) + 𝜆(1)

= 𝜆² + 𝜆

Var(x) = E(x²) – [E(x)]²

Var(x) = 𝜆² + 𝜆 − 𝜆²

Var(x) = 𝜆

Example: A random variable x has the distribution below. Find the value of C and hence the pdf of x, E(Cx) and Var(Cx).

X      0   1   2    3    4    5    6     7
f(x)   0   C   2C   2C   3C   C²   2C²   7C² + C

Recall that ∑ f(x) = 1

Therefore,

0 + C + 2C + 2C + 3C + C² + 2C² + 7C² + C = 1
10C² + 9C = 1, i.e. 10C² + 9C − 1 = 0

Using the quadratic ("Almighty") formula

x = (−b ± √(b² − 4ac)) / 2a

C = (−9 ± √(9² − 4(10)(−1))) / 2(10)

C = (−9 ± √(81 + 40)) / 20 = (−9 ± 11) / 20

C = 1/10 or C = −1

Since a probability cannot be negative, C = 1/10.

i) The pdf is given as

X      0   1      2      3      4      5      6      7
f(x)   0   1/10   2/10   2/10   3/10   1/100  2/100  17/100

∑ f(x) = 1/10 + 2/10 + 2/10 + 3/10 + 1/100 + 2/100 + 17/100 = 1

ii) E(cx) = c E(x), with c = C = 1/10

= C ∑ x f(x)

= 0.1 x [(0)(0) + (1)(1/10) + (2)(2/10) + (3)(2/10) + (4)(3/10) + (5)(1/100) + (6)(2/100) + (7)(17/100)]

= 0.1 x 366/100

= 0.1 x 3.66

= 0.366

iii) Var(cx) = C² Var(x)

= C² [E(x²) – (E(x))²]

E(x²) = (0²)(0) + (1²)(1/10) + (2²)(2/10) + (3²)(2/10) + (4²)(3/10) + (5²)(1/100) + (6²)(2/100) + (7²)(17/100)

= 0 + 1/10 + 8/10 + 18/10 + 48/10 + 25/100 + 72/100 + 833/100

= 1680/100 = 16.8

Var(x) = E(x²) – (E(x))²

= 16.8 – (3.66)²

= 16.8 – 13.4 = 3.4

Therefore,

Var(Cx) = C² Var(x)

= (0.1)² x 3.4

= 0.01 x 3.4

= 0.034

4(A) The correlation coefficient is r = ∑xy / √((∑x²)(∑y²)), where x = X − X̄ and y = Y − Ȳ.

For the ranks X = 1, 2, …, N:

∑X = N(N + 1)/2 ; ∑X² = N(N + 1)(2N + 1)/6

But

∑x² = ∑(X − X̄)²

= ∑X² − N X̄²

= ∑X² − (∑X)²/N

Therefore,

∑x² = N(N + 1)(2N + 1)/6 − N(N + 1)²/4

∑x² = (N³ − N)/12

Example 2

Three children are selected from a group of 4 boys and 3 girls. Let x represent the number of boys
selected, so that 3 − x girls are selected. The probability distribution of x is given by

f(x) = (4 choose x)(3 choose 3 − x) / (7 choose 3), for x = 0,1,2,3

Show that the distribution is a pdf.

Solution

x      0     1      2      3
f(x)   1/35  12/35  18/35  4/35

∑ f(x) = 1/35 + 12/35 + 18/35 + 4/35 = 1

For example, f(0) = (4 choose 0)(3 choose 3) / (7 choose 3) = (1 x 1)/35 = 1/35
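This hypergeometric-type pdf can be tabulated with math.comb:

```python
# f(x) = C(4, x) C(3, 3-x) / C(7, 3): number of boys among 3 children chosen
# from 4 boys and 3 girls.
import math
from fractions import Fraction

def f(x):
    return Fraction(math.comb(4, x) * math.comb(3, 3 - x), math.comb(7, 3))

probs = [f(x) for x in range(4)]
print(probs)        # [Fraction(1, 35), Fraction(12, 35), Fraction(18, 35), Fraction(4, 35)]
print(sum(probs))   # 1, so f is a pdf
```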

Example 3

Let x be a random variable of the discrete type with sample space A = {x : x = 0,1,2,3,4}, where

f(x) = 4! / (x!(4 − x)!) . (1/2)^4

i) Show that f(x) is a pdf
ii) Find P(x = 0, 1 or 2)

Solution

i) ∑ f(x) = 4!/(0!4!) (1/2)^4 + 4!/(1!3!) (1/2)^4 + 4!/(2!2!) (1/2)^4 + 4!/(3!1!) (1/2)^4 + 4!/(4!0!) (1/2)^4

∑ f(x) = 1/16 + 4/16 + 6/16 + 4/16 + 1/16 = 1

ii) ∑ f(x = 0,1,2) = 4!/(0!4!) (1/2)^4 + 4!/(1!3!) (1/2)^4 + 4!/(2!2!) (1/2)^4

= 1/16 + 4/16 + 6/16 = 11/16

Cumulative Distribution Function (CDF)

Example 4

Given a probability density function

f(x) = x²/3 for −1 < x < 2; f(x) = 0 elsewhere

i) Find the CDF of x
ii) Find P(0 < x ≤ 1)

Solution

i) CDF of x: F(x) = ∫ f(t) dt from −∞ to x; for −1 < x < 2,

F(x) = ∫ t²/3 dt from −1 to x = t³/9 evaluated from −1 to x = (x³ + 1)/9

F(x) = 0 for x ≤ −1; F(x) = (x³ + 1)/9 for −1 < x < 2; F(x) = 1 for x ≥ 2

ii) P(0 < x ≤ 1) = F(b) – F(a) = F(1) – F(0)

= (1 + 1)/9 − (0 + 1)/9 = 2/9 − 1/9 = 1/9
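The CDF derived above can be sanity-checked numerically; a crude midpoint-rule sum approximates the integral of the density over (0, 1):

```python
# Check F(x) = (x^3 + 1)/9 on -1 < x < 2 and P(0 < X <= 1) = 1/9.
def f(x):
    return x * x / 3 if -1 < x < 2 else 0.0

def F(x):
    if x <= -1:
        return 0.0
    if x >= 2:
        return 1.0
    return (x ** 3 + 1) / 9

prob = F(1) - F(0)
print(round(prob, 4))   # 0.1111, i.e. 1/9

# crude midpoint-rule approximation of the integral of f over (0, 1)
n = 100_000
h = 1.0 / n
approx = sum(f((k + 0.5) * h) * h for k in range(n))
print(round(approx, 4))   # 0.1111
```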
Exercise

Let X be a continuous random variable with the following distribution

f(x) = kx for 0 ≤ x ≤ 5; f(x) = 0 elsewhere

Evaluate P(1 < x < 2)

Probability Distribution

This is an assignment of probabilities to the values of a random variable. It is denoted by P(x) or
F(x). If P(x) is the probability that x is a value of the random variable, we can be sure that

∑ P(x) = 1

Example

Let X be the number of heads from the experiment of tossing a coin four times. Find the probability
distribution of X.

Solution: Sample Space (S) =

      HH      HT      TH      TT
HH    HHHH    HHHT    HHTH    HHTT
HT    HTHH    HTHT    HTTH    HTTT
TH    THHH    THHT    THTH    THTT
TT    TTHH    TTHT    TTTH    TTTT

Outcome        HHHH  HHHT  HHTH  HHTT  HTHH  HTHT  HTTH  HTTT
No. of heads   4     3     3     2     3     2     2     1

Outcome        THHH  THHT  THTH  THTT  TTHH  TTHT  TTTH  TTTT
No. of heads   3     2     2     1     2     1     1     0

X      0     1     2     3     4
f(X)   1/16  4/16  6/16  4/16  1/16

∑ f(x) = 1/16 + 4/16 + 6/16 + 4/16 + 1/16 = 1

Probability Mass Function

Bernoulli Distribution

If a random variable X must assume one of two possible values, 1 or 0 (success or failure),
then X has a Bernoulli distribution and is referred to as a Bernoulli random variable if and
only if its probability distribution is given by

f(x, p) = p^x (1 − p)^(1−x) for x = 0, 1

Thus f(1, p) = p and f(0, p) = 1 – p

Note: p is the only parameter, called the probability of success.

Binomial Distribution

This distribution is associated with repeated trials of an experiment in which each trial results in
one of two possible mutually exclusive outcomes, generally grouped
into two classes: success and failure.

Definition: Suppose we have a sequence of n independent trials in which there are x
successes and (n – x) failures. Then this experiment can be represented with the binomial
distribution with the following mass function:

P(X = x) = b(x, n, p) = (n choose x) p^x q^(n−x), x = 0, 1, 2, 3, …, n

Where
p = probability of success

q = 1 – p = probability of failure; note: p + q = 1

The mean of the binomial distribution is np and its variance is npq (proof in class).

Example:

Suppose that it is known that 10% of the components produced on a certain machine are
defective. If 4 components are randomly selected. Find the probability that,

i. Exactly one component is defective


ii. At least one component is defective
iii. Two or three defectives

Solution:

n = 4, p = 0.1, q = 1 – p = 0.9

i) P(X = x) = (n choose x) p^x q^(n−x)
Then
P(X = 1) = (4 choose 1)(0.1)^1 (0.9)^(4−1) = 0.2916
ii) P(X ≥ 1) = P(X = 1 or 2 or 3 or 4)
= P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)
= 1 – P(X < 1)
= 1 – P(X = 0)
= 1 − (4 choose 0)(0.1)^0 (0.9)^(4−0) = 0.3439
iii) P(X = 2 or 3) = P(X = 2) + P(X = 3)
= (4 choose 2)(0.1)^2 (0.9)^(4−2) + (4 choose 3)(0.1)^3 (0.9)^(4−3)
= 0.0486 + 0.0036
= 0.0522
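The binomial calculations above can be reproduced with a small helper function:

```python
import math

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1-p)^(n-x)
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 4, 0.1
print(round(binom_pmf(1, n, p), 4))                       # 0.2916
print(round(1 - binom_pmf(0, n, p), 4))                   # 0.3439
print(round(binom_pmf(2, n, p) + binom_pmf(3, n, p), 4))  # 0.0522
```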

Geometric Distribution
If we are interested in the number of trials before we make the first success, the geometric
distribution is the best tool. Examples: the number of
customers contacted before the first sale is made, or the number of children before the first
male child.

Definition: If p is the constant probability of success on each trial, q = 1 − p is the
constant probability of failure, and n is the number of the trial on which the first
success occurs, then the geometric distribution is given as

P(X = n) = p(1 − p)^(n−1), n = 1, 2, 3, …

= pq^(n−1)

The mean and variance of the geometric distribution are 1/p and (1 − p)/p², respectively.

Example: Suppose it is known that 30% of the applicants for a certain research job have
advanced training in statistical computing. Applicants were interviewed sequentially and
are selected at random from the pool. Find the probability that the first applicant having
advanced training in statistical computing is found on the fifth interview.

Solution

n = 5, p = 0.3, q = 0.7

P(X = 5) = (0.3)(0.7)^(5−1) = (0.3)(0.2401) = 0.072
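A helper for the geometric pmf reproduces the interview example:

```python
def geom_pmf(n, p):
    # P(first success occurs on trial n) = p (1-p)^(n-1)
    return p * (1 - p) ** (n - 1)

print(round(geom_pmf(5, 0.3), 4))   # 0.072
# mean of the distribution is 1/p
print(round(1 / 0.3, 2))            # 3.33
```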

Poisson Distribution

Poisson distribution is the widely used model for describing the frequency of occurrence
of rare events such as number of industrial accidents in a particular industry per week, the
arrival of customers at a pharmaceutical store or restaurant within a specific time period.
If X is the number of times of occurrence in a specified interval of time or space, then the
probability distribution of X is given by

P(X = x) = 𝜆^x e^(−𝜆) / x!, 𝜆 > 0, x = 0, 1, 2, …

Where 𝜆 = average number of times the random event occurs in the time period.

The mean and variance of a Poisson distribution are equal. i.e. Mean = Variance = 𝜆

Example: The average number of traffic accidents that occur at a certain road junction in a
major city in Abuja on a weekday between 3.00pm and 4.00pm is 0.8 per hour. Find the
probability of

i) 3 traffic accidents
ii) At least 3 but less than 5 traffic accidents
iii) No accident between 3.00pm and 4.00pm next Wednesday

Solution

i) P(X = 3) = 𝜆^3 e^(−𝜆) / 3! = (0.8)^3 e^(−0.8) / 3! = 0.0383

ii) P(At least 3 but less than 5 traffic accidents) = P(3 ≤ X < 5)

= P(X = 3) + P(X = 4)

= e^(−0.8) (0.08533 + 0.01707)

= 0.4493 x 0.1024 = 0.046

iii) P(no accident) = P(X = 0) = (0.8)^0 e^(−0.8) / 0!

= e^(−0.8) = 0.4493
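The Poisson answers can be reproduced with a direct implementation of the pmf:

```python
import math

def poisson_pmf(x, lam):
    # P(X = x) = lam^x e^(-lam) / x!
    return lam ** x * math.exp(-lam) / math.factorial(x)

lam = 0.8
print(round(poisson_pmf(3, lam), 4))                         # 0.0383
print(round(poisson_pmf(3, lam) + poisson_pmf(4, lam), 3))   # 0.046
print(round(poisson_pmf(0, lam), 4))                         # 0.4493
```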
