
Introduction to Probability and Statistics

A. Sudhir Kumar
December 16, 2024

Contents

Syllabus and Lecture Plan

I. Probability
Algebra of Sets and Counting Methods
Basic Concepts in Probability
Probability of an Event
Theorems in Probability
Bayes' Rule

II. Random Variable
Random Variable and its Probability Distributions
Joint Probability Distributions

III. Mathematical Expectation
Expectation of Random Variables and its Properties
Correlation Coefficient and its Properties
Moment Generating Function and its Properties
Probability Inequalities

IV. Special Probability Distributions
Discrete Probability Distributions
Continuous Probability Distributions
Central Limit Theorem

V. Sampling Theory
Syllabus and Lecture Plan

Course code: MA2201
Course name: Introduction to Probability and Statistics (ME & CIVIL)
Course category: BSC
L-T-P: 2-1-0
Credits: 3

COURSE LEARNING OBJECTIVES

1. Provide students with a formal treatment of probability theory.
2. Equip students with essential tools for statistical analysis.
3. Foster understanding through real-world statistical applications.
4. Develop skills in presenting quantitative data using appropriate diagrams and tabulations.
5. Use appropriate statistical methods in the analysis of simple datasets.
6. Instill the belief that Statistics is important for scientific research.

COURSE CONTENT
Unit – I (7 Contact hours)
PROBABILITY: Probability introduction through Sets and Relative Frequency,
Experiments and Sample Spaces, Discrete and Continuous Sample Spaces, Events,
Probability Definitions and Axioms, Mathematical Model of Experiments,
Probability as a Relative Frequency, Joint probability, Conditional Probability,
Total Probability, Bayes’ Theorem and Independent Events.

Unit – II (6 Contact hours)


RANDOM VARIABLE: Definition of a random variable, discrete and continuous
random variables, probability mass function and density function. Bivariate random
variable, joint probability mass function and joint probability density function.
Independent random variables.

Unit-III (8 Contact hours )


MATHEMATICAL EXPECTATION: Mathematical expectation of a random variable.
Mean, variance, correlation coefficient and its properties. Moment generating function
and its properties. Chebyshev's inequality, Markov's inequality.

Unit-IV (10 Contact hours)


SPECIAL PROBABILITY DISTRIBUTIONS: Bernoulli, Binomial, Poisson,
Negative Binomial, Geometric and Hypergeometric distributions (their means,
variances and related problems). Continuous distributions: Uniform, Exponential, Normal,
Beta and Gamma distributions. Central limit theorem.

Unit – V (8 Contact hours)


SAMPLING THEORY: Definitions of population, sampling, statistics and
parameters. Types of sampling, sampling distributions (Student's t distribution,
chi-square distribution and F distribution). Sampling distribution of sample mean,
difference of means, proportion and difference of proportions, variance and
comparison of variances.

Unit-VI (6 Contact hours)


THEORY OF ESTIMATION: Point estimators and interval estimators for population
mean, difference of means, proportion and difference of proportions, variance
and comparison of variances.

LEARNING RESOURCES
TEXT BOOK
William W. Hines and Douglas C. Montgomery, 'Probability and Statistics in
Engineering', Wiley, 4th Edition.

REFERENCE BOOKS
i) Sheldon Ross, ‘A First Course in Probability’, Pearson Publications, 9th Edition.
ii) Athanasios Papoulis and S. Unnikrishna Pillai, 'Probability, Random Variables
and Stochastic Processes', TMH, 4th Edition.

WEB RESOURCES
1. https://nptel.ac.in/courses/117105085/
2. https://nptel.ac.in/courses/111106112/
3. https://nptel.ac.in/courses/111102111/
4. RGUKT Course Content
COURSE OUTCOMES: At the end of the course, the student will be able to
CO 1 Apply probability theory via Bayes' Rule.

CO 2 Describe the properties of Discrete and Continuous distributions.

CO 3 Apply problem-solving techniques to real-world events.

CO 4 Apply selected probability distributions to solve problems.

CO 5 Develop the problem-solving techniques needed to accurately calculate probabilities.

CO 6 Interpret and clearly present output from statistical analysis.

Assessment Method (for theory courses only)

Course Nature: Theory

Assessment Tool   Weekly tests   Monthly tests   End Semester Test   Total
Weightage (%)     10%            30%             60%                 100%

Lecture Plan:
Unit Topic
Permutations and Combinations
Probability introduction through Sets
Probability Definitions and Axioms,
Mathematical definition of probability
Joint Probability, Conditional Probability
Bayes’ Theorem
I Bayes’ Theorem

Random Variable-1(Discrete R V)
Random Variable-2(Continuous RV)
Bivariate random variable-1(B.D.R.V)
Bivariate random variable-2 (B.C.R.V)
II Marginal and Conditional probability function.

Mathematical Expectation -1
Mathematical Expectation -2
Correlation Coefficient
Moment generating function
III Probability Inequalities

Discrete Probability Distributions-1


Discrete Probability Distributions-2
Discrete Probability Distributions-3
Discrete Probability Distributions-4
Continuous Probability Distributions-1
Continuous Probability Distributions-2
Continuous Probability Distributions-3
Continuous Probability Distributions-4
Central Limit Theorem
IV Central Limit Theorem

Sampling. Types of sampling


Sampling -Distributions (t, F and Chi-square)
Theory of estimation (point estimation and interval estimation)
Sampling distribution for single parameters (mean, proportion, variance)
Sampling distribution for comparison of two parameters (mean, proportion,
variance)
Confidence interval of mean and difference of mean
V Confidence interval for proportion and difference of proportion

Confidence interval for the variance.

Statistical Hypotheses: General Concepts


Type-1 error and type-2 error, evaluation of p values
Test for single mean
Test for proportion
Test for variance
Test for two populations means
Test for two proportions
VI Test for two variances


UNIT-I: Probability


Unit-1
Probability
1.1
Algebra of Sets and Counting Methods
The algebra of sets and counting methods are useful in understanding the basic
concepts of probability. These concepts are briefly reviewed from the point of
view of probability.

Sets and Elements of Sets: The fundamental concept in the study of
probability is the set.

A set is a well-defined collection of objects and is denoted by an upper case English
letter. The objects in a set are known as elements and are denoted by lower case
letters. A set can be written in two ways. Firstly, if the set has a finite number of
elements, we may list the elements, separated by commas and enclosed in
braces. For example, a set A with elements a, b and c may be written as

A = {a, b, c}

Secondly, the set may be described by a statement or a rule. For example, A may be
written as

A = {x : x is a natural number less than or equal to 5} = {1, 2, 3, 4, 5}

If a is an element of the set A, we write a ∈ A. If a is not an element of the set A,
then we write a ∉ A.

Equal Sets: Two sets A and B are said to be equal or identical if they have exactly
the same elements, and we write A = B.

Subset: If every element of the set A belongs to the set B, i.e., if a ∈ A implies
a ∈ B, then we say that A is a subset of B and we write A ⊆ B (A is contained in B) or
B ⊇ A (B contains A). If A ⊆ B and B ⊆ A, then A = B.

Null Set: A null or an empty set is one which does not contain any element at all
and is denoted by φ.

Note:

1. Every set is a subset of itself.
2. An empty set is a subset of every set.
3. A set {a} containing only one element is conceptually different from the
element a itself.
4. In all applications of set theory, especially in probability theory, we shall
have a fixed set S (say), given in advance, and we shall be concerned only
with subsets of S. This set S is referred to as the universal set.
1) Union or Sum: A ∪ B = {x : x ∈ A or x ∈ B}; more generally,

⋃ᵢ₌₁ⁿ Aᵢ = A₁ ∪ A₂ ∪ ... ∪ Aₙ

2) Intersection or Product: A ∩ B = {x : x ∈ A and x ∈ B}; more generally,

⋂ᵢ₌₁ⁿ Aᵢ = A₁ ∩ A₂ ∩ ... ∩ Aₙ

If A ∩ B = φ, then we say that A and B are disjoint sets.

3) Relative Difference: A − B = {x : x ∈ A and x ∉ B}

4) Complement: A' = S − A = {x : x ∈ S and x ∉ A}

Algebra of Sets:

If A, B and C are subsets of a universal set S, then the following laws hold:

Commutative laws: A ∪ B = B ∪ A, A ∩ B = B ∩ A

Associative laws: (A ∪ B) ∪ C = A ∪ (B ∪ C), (A ∩ B) ∩ C = A ∩ (B ∩ C)

Distributive laws: A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C), A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

Complementary laws: A ∪ A' = S, A ∩ A' = φ, S' = φ, φ' = S

Difference laws: A − B = A ∩ B', B − A = B ∩ A',
A = (A − B) ∪ (A ∩ B), (A − B) ∩ (A ∩ B) = φ

De Morgan's laws: (A ∪ B)' = A' ∩ B', (A ∩ B)' = A' ∪ B';

(⋃ᵢ₌₁ⁿ Aᵢ)' = ⋂ᵢ₌₁ⁿ Aᵢ' and (⋂ᵢ₌₁ⁿ Aᵢ)' = ⋃ᵢ₌₁ⁿ Aᵢ'

Involution law: (A')' = A

Idempotent laws: A ∪ A = A, A ∩ A = A

Class of Sets: A group of sets will be termed a class of sets. We shall define
some useful types of classes used in probability.

Field: A field (or algebra) F is a non-empty class of sets which is closed under
the formation of finite unions and under complementation. Thus,

(i) A, B ∈ F ⟹ A ∪ B ∈ F and
(ii) A ∈ F ⟹ A' ∈ F

σ-Field: A σ-field (or σ-algebra) F is a non-empty class of sets that is
closed under the formation of countable unions and complementation. Thus,

(i) A₁, A₂, ... ∈ F ⟹ ⋃ᵢ₌₁^∞ Aᵢ ∈ F
(ii) A ∈ F ⟹ A' ∈ F


Fundamental Principle of Addition (Principle of Inclusion-Exclusion)

Let A₁, A₂, ..., Aₘ be sets, the elements within each set being distinct. Then the
number of ways of selecting an element from A₁ or A₂ or ... or Aₘ is given by

n(⋃ᵢ₌₁ᵐ Aᵢ) = Σᵢ n(Aᵢ) − Σᵢ<ⱼ n(Aᵢ ∩ Aⱼ) + Σᵢ<ⱼ<ₖ n(Aᵢ ∩ Aⱼ ∩ Aₖ) − ... + (−1)ᵐ⁻¹ n(⋂ᵢ₌₁ᵐ Aᵢ)

where n(Aᵢ) represents the number of elements in Aᵢ.

Note:

1. n(A ∪ B) = n(A) + n(B) − n(A ∩ B)
2. n(A ∪ B ∪ C) = n(A) + n(B) + n(C) − n(A ∩ B) − n(B ∩ C) − n(C ∩ A) + n(A ∩ B ∩ C)

Example 1: Find the number of ways of selecting

(i) a diamond or a heart
(ii) an ace or a spade

from a pack of cards.

Solution: Let A be the set of diamonds, B be the set of hearts, C be the set of
aces and D the set of spades.

(i) Here n(A) = 13 and n(B) = 13, and A and B are disjoint.

Hence n(A ∪ B) = n(A) + n(B) = 13 + 13 = 26.

(ii) Here n(C) = 4 and n(D) = 13. Note that C and D are not disjoint and
n(C ∩ D) = 1 (the ace of spades). Hence n(C ∪ D) = n(C) + n(D) − n(C ∩ D) = 4 + 13 − 1 = 16.

Note: If A₁, A₂, ..., Aₘ are pairwise disjoint sets, then there will be no common
elements in these sets and hence

n(⋃ᵢ₌₁ᵐ Aᵢ) = Σᵢ₌₁ᵐ n(Aᵢ)

Fundamental Principle of Multiplication (Product Rule)

Let A₁, A₂, ..., Aₘ be sets, the elements within each set being distinct. Then the
number of ways of selecting a first object from A₁, a second object from A₂, ..., an
m-th object from Aₘ in succession is given by

n(A₁) × n(A₂) × ... × n(Aₘ)

Example 2: A man has m different shirts and n different pants. In how many
different ways can he be dressed?

Solution: Choosing a dress means a selection of one shirt and one pant. The total
number of ways of choosing a dress is m × n.

Example 3: Two dice are thrown.

(i) How many different outcomes are there?
(ii) How many different outcomes have distinct values (no doubles)?

Solution: On each die, we may get the number 1, 2, 3, 4, 5 or 6.

One outcome means one number on the first die and another number on the second die.

(i) Number of different outcomes = 6 × 6 = 36
(ii) Number of different outcomes with no doubles = 6 × 5 = 30

Permutations
A permutation is an arrangement or an ordered selection of objects. The order of
the objects in a permutation is important.

1) The number of permutations of n different objects taken r at a time is
nPr = n!/(n − r)! when repetition of objects is not allowed.

The number of permutations of n different objects taken r at a time is
nʳ when repetition of objects is allowed any number of times.

2) The number of permutations of n different objects taken all at a time when
repetition of objects is not allowed is n!.
3) If there are n objects, n₁ of type 1, n₂ of type 2, ..., nₖ of type k, where
n₁ + n₂ + ... + nₖ = n, then the number of permutations of these n
objects taken all at a time is

n!/(n₁! n₂! n₃! ... nₖ!)

4) The number of permutations of n different objects taken r at a time without
repetitions in which
(i) k particular objects will always occur is (n−k)P(r−k) × rPk
(ii) s particular objects will never occur is (n−s)Pr
(iii) k particular objects will always occur and s particular objects will
never occur is (n−k−s)P(r−k) × rPk

Combinations
A combination is an unordered selection or subset of objects. The order of the
objects in a combination is not important.

1) The number of combinations of n different objects taken r at a time is
denoted by nCr, and nCr = nPr/r! when repetition of objects is not allowed.


The number of combinations of n different objects taken r at a time is
(n+r−1)Cr when repetition of objects is allowed.

2) The number of combinations of n different objects taken r at a time without
repetitions in which
(i) k particular objects will always occur is (n−k)C(r−k)
(ii) s particular objects will never occur is (n−s)Cr
(iii) k particular objects will always occur and s particular objects will never
occur is (n−k−s)C(r−k)
3) The number of combinations of n different objects taken any number (one or
more) at a time when repetitions are not allowed is 2ⁿ − 1.

4) The total number of combinations of p + q + r + ... objects taken any
number at a time when p objects are of type 1, q are of type 2, r are of
type 3, ... is (p + 1)(q + 1)(r + 1) ... − 1.

5) The total number of combinations of p + q + r + ... + n objects taken
any number at a time when p objects are of type 1, q are of type 2, r are of
type 3, ..., and the remaining n objects are all different is
(p + 1)(q + 1)(r + 1) ... 2ⁿ − 1.

Circular Permutations
An arrangement of objects in a circle is known as a circular permutation.

1) The number of circular permutations of n different objects taken all at a time
is (n − 1)!.
2) The number of circular permutations of n different objects taken all at a time
when clockwise and anticlockwise arrangements are considered the same (as
in a necklace or garland) is (n − 1)!/2.
3) The number of circular permutations of n different objects taken r at a time
is nPr/r.
4) The number of circular permutations of n different objects taken r at a time
when no distinction is made between the clockwise and anticlockwise
directions is nPr/(2r).
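Result 1) can likewise be confirmed by enumeration: two seatings count as the same
circular permutation when one is a rotation of the other. A minimal sketch with an
assumed n = 5:

# Canonicalize each linear arrangement by rotating object 0 to the front,
# so that rotations of one another collapse to a single representative.
from itertools import permutations
from math import factorial

n = 5
seatings = set()
for p in permutations(range(n)):
    i = p.index(0)
    seatings.add(p[i:] + p[:i])
print(len(seatings), factorial(n - 1))   # both 24, i.e., (n−1)!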

Distribution or Occupancy Problems

The number of ways r objects can be distributed among n different boxes
depends upon how many objects are permitted in one box and
whether the objects are different or not. Problems involving the distribution of
objects among boxes are called distribution or occupancy problems.

The distribution of different objects corresponds to permutations and the distribution
of identical objects corresponds to combinations.

Distribution of Different Objects:

1. The number of ways of distributing r different objects into n different boxes if
(i) no restriction is placed on the number of objects permitted in a box is nʳ.
(ii) a particular box contains exactly k objects is rCk (n − 1)^(r−k).
(iii) at most one object is permitted in a box is nPr (r ≤ n).
2. The number of ways of distributing rᵢ objects to the i-th box for
i = 1, 2, ..., n such that r₁ + r₂ + ... + rₙ = r is given by

r!/(r₁! r₂! r₃! ... rₙ!)

Distribution of Identical Objects:

1) The number of ways of distributing r identical objects into n different boxes if
(i) no restriction is placed on the number of objects permitted per box is
(n+r−1)Cr (Bose–Einstein formula)
(ii) a particular box contains exactly k objects is

((n−1) + (r−k) − 1)C(r−k) = (n+r−k−2)C(r−k)

(iii) at most one object is permitted per box is nCr (r ≤ n)
(Fermi–Dirac formula)
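The Bose–Einstein count in 1)(i) can be checked by listing occupancy vectors
directly; the box and object counts below are illustrative assumptions.

# Count occupancy vectors (r1, ..., rn) with r1 + ... + rn = r.
from itertools import product
from math import comb

n, r = 4, 6
brute = sum(1 for occ in product(range(r + 1), repeat=n) if sum(occ) == r)
print(brute, comb(n + r - 1, r))    # both 84: brute force agrees with (n+r−1)Cr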

Example 4: Some standard arrangement problems:

S.No  Objects              Arrangement  Problem
1     boys and girls       row          no two girls together
2     boys and girls       circle       no two girls together
3     boys and girls       row          no two girls together
4     boys and girls       row          boys and girls alternate
5     boys and girls       circle       no two girls together
6     boys and girls       circle       boys and girls alternate
7     + signs and − signs  row          no two − signs together
8     + signs and − signs  circle       no two − signs together
9     + signs and − signs  row          no two − signs together
10    + signs and − signs  row          + and − signs alternate
11    + signs and − signs  circle       no two − signs together
12    + signs and − signs  circle       + and − signs alternate


Example 5:

(i) Find the number of 4-letter words that can be formed using the letters of
the word EQUATION.
(ii) How many of these words begin with a given letter, say E?
(iii) How many end with a given letter, say N?
(iv) How many begin with E and end with N?

Solution: The word EQUATION has 8 distinct letters.

(i) The number of 4-letter words is 8P4 = 1680.
(ii) The first letter is fixed (E). The remaining three places are to be
filled with the 7 remaining letters. Thus, the number of 4-letter words
beginning with E is 7P3 = 210.
(iii) Similarly, the number of words ending with N is 7P3 = 210.
(iv) The number of words beginning with E and ending with N is 6P2 = 30.

Example 6: Find the number of 4-letter words that can be formed using the
letters of the word MIXTURE which

(i) contain a given letter, say X
(ii) do not contain the letter X

Solution: Take 4 blanks. We have to fill up the 4 blanks using the 7
letters of the word.

(i) First we put X in one of the blanks. This can be done in 4 ways. Now
we can fill the remaining 3 places with the remaining 6 letters in
6P3 = 120 ways. Thus, the number of 4-letter words containing the letter X is
4 × 120 = 480.
(ii) Leaving out the letter X, we fill the 4 blanks with the remaining 6 letters in
6P4 = 360 ways. Thus, the number of 4-letter words that do not contain the
letter X is 360.

Example 7: Find the number of r-digit numbers that can be formed using n given
digits when repetition is allowed.

Solution: Each of the r places can be filled in n ways, so the number of r-digit
numbers with repetition is nʳ.

Example 8: Find the number of ways of arranging the letters of the word
SPECIFIC. In how many of them

(i) do the two C's come together?
(ii) do the two I's not come together?

Solution: The word SPECIFIC has 8 letters in which there are 2 I's and 2 C's.
Hence, they can be arranged in

8!/(2! 2!) = 10080 ways

(i) Treat the two C's as one unit. Then we have 7 letters in which two
letters (the I's) are alike.
Thus, the number of arrangements is 7!/2! = 2520.
(ii) Keeping the two I's aside, the remaining 6 letters can be arranged
in 6!/2! ways. Among these 6 letters we find 7 gaps. The two I's can be
arranged in these 7 gaps in 7P2/2! ways.

Hence, the number of required arrangements is (6!/2!) × (7P2/2!) = 360 × 21 = 7560.

Example 9: Find the number of ways of selecting a given number of boys and a given
number of girls from a group of boys and girls.

Solution: By the product rule, this is the product of the number of ways of selecting
the boys and the number of ways of selecting the girls (see the sketch after Example 10).

Example 10: Find the number of ways of forming a committee of a given size out of
a group of boys and girls such that there is at least one girl in the committee.

Solution: This is the total number of committees minus the number of committees
containing no girl (see the sketch below).
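As a concrete illustration of the two patterns, here is a hedged sketch with assumed
numbers (8 boys and 5 girls; selecting 3 boys and 2 girls in Example 9, and a 5-member
committee in Example 10); these counts are illustrative only.

from math import comb

boys, girls = 8, 5

# Example 9 pattern: product rule over two independent selections.
print(comb(boys, 3) * comb(girls, 2))           # 56 × 10 = 560

# Example 10 pattern: all committees minus the all-boy committees.
print(comb(boys + girls, 5) - comb(boys, 5))    # 1287 − 56 = 1231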

Derangements and Matches

If n objects numbered 1, 2, ..., n are distributed at random in n places also
numbered 1, 2, ..., n, a match is said to occur when an object occupies the place
corresponding to its number. The number of permutations in which no match
occurs is

Dₙ = n! (1 − 1/1! + 1/2! − 1/3! + ... + (−1)ⁿ/n!)

Such a permutation is also known as a derangement.

The number of permutations of the n objects in which exactly r matches occur is
nCr × D(n−r).
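The derangement formula is easy to verify against direct enumeration for small n;
here is a minimal check with an assumed n = 6.

# Dn by the alternating-series formula versus brute-force enumeration.
from itertools import permutations
from math import factorial

def derangements(n):
    return round(factorial(n) * sum((-1) ** k / factorial(k) for k in range(n + 1)))

n = 6
brute = sum(1 for p in permutations(range(n)) if all(p[i] != i for i in range(n)))
print(brute, derangements(n))   # both 265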

1.1. Algebra of Sets and Counting Methods
Exercise:

1. The letters of the word MISSISSIPPI are arranged. Find

a. All possible arrangements.

b. All arrangements in which 4S’s come together.

c. All arrangements in which 4S’s do not come together.

d. All arrangements in which 4S’s and 4I’s come together.

2. If A and B stand in a line along with 10 other persons, then find the
number of ways in which there are three persons between A and B.

3. If A and B stand in a circle along with 10 other persons, then find the
number of ways in which there are three persons between A and B.

4. The letters of the word FLOWER are taken 4 at a time and arranged in all
possible ways. Find the number of arrangements that
a. Begins with F and ends with R .
b. Contain the letter E.
5. Find the number of ways in which the letters of the word HOSTEL can be
arranged so that
a. The vowels may not be separated.
b. The vowels occupy even places.

6. All the letters of the word EAMCET are arranged in all possible ways. Find
the number of arrangements in which no two vowels are adjacent.

7. Find the number of arrangements that can be made by taking all the letters
of the word ALGEBRA.


8. Find the number of arrangements that can be made by taking all the letters
of the word MATHEMATICS such that
a. 2M’s come together.
b. 2M’s do not come together.

9. There are 5 maths, 6 physics and 8 chemistry books. How many ways are
there to pick
a. Two books not both on the same subject.
b. Any two books.

10. How many ways are there to form 3-letter words using the letters A, B, C,
D, E, F?
a. with repetition of letters.
b. without repetition of any letter.
c. without repetition that contain the letter E.
d. with repetition that contain E.

11. Find the number of arrangements which can be made using all the letters
of the word LAUGH if the vowels are adjacent.

12. If all permutations of the letters of the word AGAIN are arranged as in a
dictionary, find the 50th word.

13. Find the number of ways in which any four letters can be (i) arranged and
(ii) selected from the word CORGOO

14. Find the total number of (i) permutations and (ii) combinations of 4 letters
that can be made out of the letters of the word EXAMINATION.

15. The digits 1,2,3,4 and 5 are given. Find

a. 3 digit numbers without repetitions.


b. 3 digit numbers with repetitions.
c. 3 digit odd numbers without repetitions.
d. 3 digit odd numbers with repetitions.
e. 3 digit even numbers without repetitions.
f. 3 digit even numbers with repetitions.
g. 5 digit numbers without repetitions.
h. 5 digit numbers with repetitions.

16. Find the number of 3-digit odd numbers that can be formed with the digits
1, 2, 3, 4, 5 when repetition of digits is
a. Not allowed
b. Allowed


1.2
Basic Concepts in Probability
Introduction to uncertainty
Every day we have been coming across statements like the ones mentioned
below:

1. Probably it will rain tonight.


2. It is quite likely that there will be a good yield of paddy this year.
3. Probably I will get a first class in the examination.
4. India might win the cricket series against Australia
and so on.

In all the above statements some element of uncertainty or chance is involved.


A numerical measure of uncertainty is provided by a very important branch of
statistics known as Theory of Probability. In the words of Prof. Ya-Lin-Chou:
Statistics is the science of decision making with calculated risks in the face of
uncertainty.

History of Probability

The history of probability suggests that its theory developed with the study of
games of chance, such as rolling of dice, drawing a card from a pack of cards, etc.
Two French gamblers had once decided that any one person who will first get a
‘particular point’ will win the game. If the game is stopped before reaching that
point, the question is how to share the stake. This and similar other problems
were then posed by the great French mathematician Blaise Pascal, who after
consulting another great French mathematician Pierre de Fermat, gave the
solution of the problems and then laid down a strong foundation of probability.
Later on, another French mathematician, Laplace, improved the definition of
probability.


Coins, Dice and Playing Cards: The basic concepts in probability are better
explained using coins, dice and playing cards. The knowledge of these is very
much useful in solving problems in probability.

Coin: A coin is round in shape and it has two sides. One side is known as head (H)
and the other is known as tail (T). When a coin is tossed, the side on the top is
known as the result of the toss.

Die: A die is a cube in shape, in which the length, breadth and height are equal. It has six
faces which have the same area and are numbered from 1 to 6. The plural of die is dice.
When a die is thrown, the number on the top face is the result of the throw.

Pack of Cards: A pack of cards contains 52 cards. It is divided into four suits called spades,
clubs, hearts and diamonds. Spades and clubs are black; hearts and diamonds are
red in colour. Each suit consists of 13 cards: nine cards numbered
from 2 to 10, an ace, a jack, a queen and a king. We shuffle the cards and then take a
card from the top, which is the result of selecting a card.

Basic Concepts in Probability


The following basic concepts are very important in understanding the definitions
of the probability:

Experiment: The process of making an observation or measurement about a
phenomenon is known as an experiment.

Example 1: Sitting in the balcony of the house and watching the movement of
clouds in the sky is an experiment.

Example 2: For given values of pressure (P), measuring the corresponding values
of volume (V) of a gas and observing that P·V = k (constant) is an experiment.
The experiments are of two types:


Deterministic experiment: If an experiment produces the same result when it is
conducted several times under identical conditions, then the experiment is known
as a deterministic experiment.
All the experiments in physical and engineering sciences are deterministic.

Random Experiment: If an experiment produces different results even though it is


conducted several times under identical conditions, then the experiment is known
as random experiment. All the experiments in social sciences are random.

Trial: Conducting a random experiment once is known as a trial.

Outcome: A result of a random experiment in a trial is known as an outcome.


Outcomes are denoted by lowercase letters 𝑎, 𝑏, 𝑐, 𝑑, 𝑒, … .

Equally Likely Outcomes: Outcomes of a random experiment are said to be
equally likely if all have the same chance of occurrence. Getting an H and getting a T
with a balanced coin are equally likely. The outcomes 1, 2, 3, 4, 5 and 6 are equally
likely if the die is a true cube.

Sample space: The set of all possible outcomes of a random experiment is known
as a sample space and denoted by S.

Event: A subset of the sample space is known as an event.


The events are denoted by uppercase letters A, B, C etc.

Happening of an event: We say that an event happens (or occurs) if any one
outcome in it happens (or occurs).

Elementary Event: A singleton set consisting of a single outcome of a random
experiment is known as an elementary event.

Favorable outcomes: The outcomes in an event are known as favorable


outcomes or cases of that event.

Impossible Event: An event with no outcome in it is known as impossible event


and is denoted by 𝝓.


Certain or Sure Event: An event consisting of all possible outcomes of a random


experiment is known as certain or sure event and it is same as the sample space.

Exhaustive Events: The events in a sample space are said to be exhaustive if their
union is equal to the sample space. The events A₁, A₂, ..., Aₙ in S are said to be
exhaustive if

⋃ᵢ₌₁ⁿ Aᵢ = S

Mutually Exclusive Events: Two or more events in the sample space are said to be
mutually exclusive if the happening of one of them precludes the happening of
the others. Mathematically two events 𝐴 and 𝐵 in S are said to be mutually
exclusive if 𝐴 ∩ 𝐵 = 𝜙.

Example 3: Consider a random experiment of tossing a coin. The possible
outcomes are H and T. Thus, the sample space is given by S = {H, T} and
n(S) = 2, where n(S) is the total number of outcomes in S.

Example 4: Consider a random experiment of tossing two coins (or two tosses of a
coin). The sample space is given by S = {H, T} × {H, T} = {HH, HT, TH, TT} and
n(S) = 2² = 4.

Example 5: Consider a random experiment of tossing three coins (or three tosses
of a coin). The sample space is given by

S = {H, T} × {H, T} × {H, T} = {H, T} × {HH, HT, TH, TT}

= {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}

and n(S) = 2³ = 8.


Let us define some events in the sample space as below:

𝐸1 : Three heads

𝐸2 : Three tails

𝐸3 : Exactly one head

𝐸4 : Exactly two heads

𝐸5 : At least one head

𝐸6 : At least two heads

Then these events are represented by the following subsets of 𝑆:

E₁ = {HHH};

E₂ = {TTT};

E₃ = {HTT, THT, TTH};

E₄ = {HHT, HTH, THH};

E₅ = {HHH, HHT, HTH, HTT, THH, THT, TTH} and

E₆ = {HHH, HHT, HTH, THH}.

Note that E₁ ∪ E₂ ∪ E₃ ∪ E₄ = S and hence E₁, E₂, E₃ and E₄ are exhaustive
events in S. Further, Eᵢ ∩ Eⱼ = φ for i ≠ j (i, j = 1, 2, 3, 4). Hence, E₁, E₂, E₃ and E₄ are
mutually exclusive events in S.

Note: In general, if a random experiment consists of tossing N coins (or N tosses
of a coin), then n(S) = 2^N.
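The sample space of Example 5 and the event counts above can be reproduced with a
few lines of Python; the event names follow the text.

from itertools import product

S = ["".join(t) for t in product("HT", repeat=3)]
print(len(S), S)                             # 8 outcomes: HHH, HHT, ..., TTT

E4 = [s for s in S if s.count("H") == 2]     # exactly two heads
E5 = [s for s in S if "H" in s]              # at least one head
print(E4, len(E5))                           # ['HHT', 'HTH', 'THH'] and 7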

Example 6: Let us consider a random experiment of throwing a die. Since we can
obtain any one of the six faces 1, 2, 3, 4, 5 and 6, the sample space is given by
S = {1, 2, 3, 4, 5, 6} and n(S) = 6.

Now define E₁ = {1, 3, 5}, E₂ = {2, 4, 6} and E₃ = {3, 6}. We say that E₁ happens
or occurs if we get the outcome 1, 3 or 5. In other words, we say that E₁ happens


if we get an odd number. Similarly, we say that E₂ happens if we get an even
number and E₃ happens if we get a multiple of 3.

Since E₁, E₂ and E₃ are subsets of S, they are events in S. Since
E₁ ∪ E₂ = S, E₁ and E₂ are exhaustive events in S. Since E₁ ∪ E₃ = {1, 3, 5, 6} ≠ S,
E₁ and E₃ are not exhaustive events in S. Since E₁ ∩ E₂ = φ, E₁ and E₂ are
mutually exclusive events in S. Since E₁ ∩ E₃ = {3}, E₁ and E₃ are not mutually
exclusive events in S. Similarly, E₂ and E₃ are not mutually exclusive events in S.

Example 7: In a random experiment of throwing two dice (or two throws of a die),
the sample space is given by

S = {1, 2, 3, 4, 5, 6} × {1, 2, 3, 4, 5, 6}

= { (1,1), (1,2), ..., (1,6),
    (2,1), (2,2), ..., (2,6),
    (3,1), (3,2), ..., (3,6),
    (4,1), (4,2), ..., (4,6),
    (5,1), (5,2), ..., (5,6),
    (6,1), (6,2), ..., (6,6) }

where in the outcome (a, b), a represents the number obtained on the first
die and b represents the number on the second die. Obviously (a, b) ≠ (b, a)
unless a = b. The number of outcomes in S is given by n(S) = 6² = 36.


Let us define the following events in 𝑆.

𝐸1 : Sum of points on two dice is 5

𝐸2 : Sum of points on two dice is 6

𝐸3 : Sum of points on two dice is even

𝐸4 : Sum of points on two dice is odd

𝐸5 : Sum of points on two dice is greater than 12

𝐸6 : Sum of points on two dice is divisible by 3

𝐸7 : Sum is greater than or equal to 2 and is less than or equal to 12

Then the events 𝐸1 to 𝐸7 as subsets of 𝑆 are given below.

E₁ = {(1,4), (2,3), (3,2), (4,1)} and n(E₁) = 4

E₂ = {(1,5), (2,4), (3,3), (4,2), (5,1)} and n(E₂) = 5

The sum of the points on the two dice is even if the points obtained on the two dice
are (i) both even or (ii) both odd. Thus

E₃ = ({2,4,6} × {2,4,6}) ∪ ({1,3,5} × {1,3,5})

= {(2,2), (2,4), (2,6), (4,2), (4,4), (4,6), (6,2), (6,4), (6,6), (1,1), (1,3), (1,5),
(3,1), (3,3), (3,5), (5,1), (5,3), (5,5)}

and n(E₃) = 3 × 3 + 3 × 3 = 9 + 9 = 18.

Similarly,

E₄ = ({2,4,6} × {1,3,5}) ∪ ({1,3,5} × {2,4,6})

= {(2,1), (2,3), (2,5), (4,1), (4,3), (4,5), (6,1), (6,3), (6,5), (1,2), (1,4), (1,6),
(3,2), (3,4), (3,6), (5,2), (5,4), (5,6)}

and n(E₄) = 3 × 3 + 3 × 3 = 18.


Further, E₅ = φ, i.e., E₅ is an impossible event, and E₇ = S, i.e., E₇ is a certain
event. Hence n(E₅) = 0 and n(E₇) = 36.

The sum of the points on the two dice is divisible by 3 if their sum is 3, 6, 9 or 12.
Thus

E₆ = {(1,2), (2,1), (1,5), (2,4), (3,3), (4,2), (5,1), (3,6), (4,5), (5,4), (6,3), (6,6)}
and n(E₆) = 12.
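The counts n(E₁) = 4, n(E₃) = 18 and n(E₆) = 12 found above can be confirmed by
enumerating the 36 outcomes directly:

from itertools import product

S = list(product(range(1, 7), repeat=2))
E1 = [(a, b) for a, b in S if a + b == 5]           # sum is 5
E3 = [(a, b) for a, b in S if (a + b) % 2 == 0]     # sum is even
E6 = [(a, b) for a, b in S if (a + b) % 3 == 0]     # sum divisible by 3
print(len(S), len(E1), len(E3), len(E6))            # 36 4 18 12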

Note: In general, if the random experiment consists of throwing N dice (or N
throws of a die), the number of outcomes in S is given by n(S) = 6^N.

Example 8: Let us consider the random experiment of tossing a coin and a die
together. Then the sample space is given by

S = {H, T} × {1, 2, 3, 4, 5, 6}

= {(H,1), (H,2), (H,3), (H,4), (H,5), (H,6), (T,1), (T,2), (T,3), (T,4), (T,5), (T,6)}

and n(S) = 2 × 6 = 12.

Note: In the above examples 3 to 8, if the coins and dice are unbiased, the
outcomes in the sample spaces are equally likely. Normally, the coins are
balanced and hence are unbiased. If a die is a cube, then all the surfaces have the
same area and also it is unbiased.

Example 9: Let us consider the random experiment of selecting two balls
simultaneously from an urn containing 4 balls of different colours: red (R),
blue (B), yellow (Y) and white (W). Then the sample space is given by

S = {RB, RY, RW, BY, BW, YW} and n(S) = 4C2 = 6

Example 10: If the random experiment consists of selecting two balls one after
the other with replacement in Example 9, the sample space is given by

S = {R, B, Y, W} × {R, B, Y, W}
= {RR, RB, RY, RW, BR, BB, BY, BW, YR, YB, YY, YW, WR, WB, WY, WW} and
n(S) = 4 × 4 = 16.


Example 11: If the random experiment consists of selecting two balls one after
the other without replacement in Example 9, the sample space is given by

S = {RB, RY, RW, BR, BY, BW, YR, YB, YW, WR, WB, WY} and
n(S) = 4 × 3 = 12.

Example 12: Consider a random experiment of tossing a coin until a head appears.
Its sample space is given by

S = {H, TH, TTH, TTTH, ...}

where TTH represents a tail on the first toss, a tail on the second and a head on the
third toss, and so on. Obviously, n(S) is infinite.

Example 13: Consider a random experiment of tossing a coin repeatedly until a head
or a tail appears twice in succession. The sample space is given by

S = {HH, TT, THH, HTT, HTHH, THTT, ...}

and n(S) is infinite.


1.2. Basic Concepts in Probability

Exercise:
1. A die is tossed twice and the number of dots facing up is counted and noted in
the order of occurrence. Let us define
A ∶ Total number of dots showing is even
B ∶ Both dice are even
C ∶ Number of dots in dice differ by 1
(i) Does A imply B or does B imply A?
(ii) Find A ∩ C.

2. A desk drawer contains five pens, three of which are dry.


(i) The pens are selected at random one by one until a good pen is found. The
sequence of test results is noted. What is the sample space?
(ii) Suppose that only the number, and not the sequence, of pens tested in
part (i) is noted. Specify the sample space.
3. Write the sample space corresponding to each of the following random
experiment.
(i) Select a ball from an urn containing balls numbered 1 to 50. Note the
number of the ball.
(ii) Select a ball from an urn containing balls numbered 1 to 4. Suppose that
balls 1 and 2 are black and balls 3 and 4 are white. Note the number and
colour of the ball you select.
(iii) Toss a coin three times and note the sequence of heads and tails.
(iv) Toss a coin four times and note the number of tails
(v) Count the number of voice packets containing only silence produced
from a group of N speakers in a 10-minute period.
(vi) A block of information is transmitted repeatedly over a noisy channel
until an error free block arrives at the receiver. Count the number of
transmissions required.
(vii) Pick a number at random between 0 and 1.
(viii) Measure the time between two message arrivals at a message centre.


(ix) Measure the lifetime of a given computer memory chip in a specified


environment.
(x) Pick two numbers at random between 0 and 1.


1.3.
Definitions of Probability
The probability of a given event is an expression of the likelihood or chance of
occurrence of the event. How this number is assigned depends on the
interpretation of the term 'probability'. There is no general agreement about its
interpretation. However, broadly speaking, there are four different schools of
thought on the concept of probability.

Mathematical (or classical or A priori) definition of probability

Let S be a sample space associated with a random experiment. Let E be an event
in S. We make the following assumptions on S:

(i) It is discrete and finite
(ii) The outcomes in it are equally likely

Then the probability of happening (or occurrence) of the event E is defined by

P(E) = n(E)/n(S) = (number of favourable outcomes)/(total number of outcomes)

Note:

i) The probability of non-happening (or non-occurrence) of E is given by
P(E') = 1 − P(E). That is, P(E) + P(E') = 1.

ii) If E = φ, then P(E) = 0. That is, the probability of an impossible
event is zero.
iii) If E = S, then P(E) = 1. That is, the probability of a certain event is one.
iv) For any event E in S, 0 ≤ P(E) ≤ 1.
v) The odds in favour of E are given by n(E) : n(E').
vi) The odds against E are given by n(E') : n(E).
vii) If the odds in favour of E are a : b, then P(E) = a/(a + b).
viii) If the odds against E are a : b, then P(E) = b/(a + b).
ix) n(E) and n(S) are counted by using the methods of counting discussed in Module 1.1.

Limitations: The mathematical definition of probability breaks down in the


following cases:

(i) The outcomes in the sample space are not equally likely.
(ii) The number of outcomes in the sample space is infinite.

Statistical (or Empirical or Relative Frequency or Von Mises) Definition of


Probability

If a random experiment is performed repeatedly under identical conditions, then


the limiting value of the ratio of the number of times the event occurs to the
number of trials, as the number of trials becomes indefinitely large, is called the
probability of happening of the event, it being assumed that the limit is finite and
unique.

Symbolically, if in n trials an event E happens m times, then the probability of
the happening of E is given by

P(E) = lim (n→∞) m/n … (1.3.1)

Note:

i) Since the probability is obtained objectively by repetitive empirical


observations, it is known as Empirical Probability.
ii) The empirical probability approaches the classical probability as the number of
trials becomes indefinitely large.
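A minimal simulation illustrating note (ii): the relative frequency m/n of heads in n
tosses of a fair coin settles near the classical value 1/2 as n grows. The seed and
trial counts are arbitrary choices.

import random

random.seed(0)
for n in (100, 10_000, 1_000_000):
    m = sum(random.random() < 0.5 for _ in range(n))   # heads in n trials
    print(n, m / n)                                    # m/n approaches 0.5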

Limitations of Empirical Probability

(i) If an experiment is repeated a large number of times, the experimental


conditions may not remain identical.
(ii) The limit in (1.3.1) may not attain a unique value, however large n may be.


Subjective definition of probability: In this method, probabilities are assigned to
events according to one's knowledge, experience and belief about the happening of
the events. The main limitation of this definition is that it varies from person to
person.

Axiomatic Definition of Probability: Let S be a sample space and let F be a
σ-field associated with S. A probability function (or measure) P is a real-valued
set function having domain F which satisfies the following three axioms:

1. P(A) ≥ 0 for every A ∈ F (Non-negativity)
2. P(S) = 1, i.e., P is normed (Normality)
3. If A₁, A₂, ... are mutually exclusive events in F, then

P(⋃ᵢ₌₁^∞ Aᵢ) = Σᵢ₌₁^∞ P(Aᵢ) (σ-additive or countably additive)

Thus, the probability function P is a normed measure on the measurable space
(S, F); (S, F, P) is called a probability space. This definition is useful in proving
theorems on probability.

Note: The elements of F are events in S.

Solved Examples using Mathematical Definition of Probability

In this section, we use the mathematical definition of probability for computing
probabilities. We also use the methods of counting to count the number of
outcomes in an event and in the sample space.

Example 1: A uniform die is thrown at random. Find the probability that the
number on it is (i) even, (ii) odd, (iii) even or a multiple of 3, (iv) even and a multiple
of 3, (v) greater than 4.

Solution: Here n(S) = 6.

(i) The number of favourable cases for the event of getting an even number is
3, viz., 2, 4, 6.
Required probability = 3/6 = 1/2

(ii) The number of favourable cases for the event of getting an odd number is 3,
viz., 1, 3, 5.
Required probability = 3/6 = 1/2
(iii) The number of favourable cases for the event of getting an even number or a
multiple of 3 is 4, viz., 2, 3, 4, 6.
Required probability = 4/6 = 2/3
(iv) The number of favourable cases for the event of getting an even number and a
multiple of 3 is 1, viz., 6.
Required probability = 1/6
(v) The number of favourable cases for the event of getting a number greater than 4
is 2, viz., 5 and 6.
Required probability = 2/6 = 1/3

Example 2: Four cards are drawn at random from a pack of 52 cards. Find the
probability that

(i) They are a king, a queen, a jack and an ace.


(ii) Two are kings and two are aces.
(iii) All are diamonds.
(iv) Two are red and two are black.
(v) There is one card of each suit.
(vi) There are two cards of clubs and two cards of diamonds.

Solution: Four cards can be drawn from a well shuffled pack of 52 cards in 52C4
ways, which gives the exhaustive number of cases.

(i) 1 king can be drawn out of the 4 kings in 4C1 = 4 ways. Similarly, 1 queen, 1 jack
and an ace can each be drawn in 4C1 = 4 ways. Since any one of the ways of
drawing a king can be associated with any one of the ways of drawing a queen,
a jack and an ace, the number of favourable cases is 4C1 × 4C1 × 4C1 × 4C1.

Hence, required probability = (4C1 × 4C1 × 4C1 × 4C1)/52C4 = 256/52C4


C2  C2
4 4
(ii) Required probability  52
C4
(iii) Since 4 cards can be drawn out of 13 cards (since there are 13 cards of
diamond in a pack of cards) in 13C4 ways,
13
C4
Required probability  52
C4
(iv) Since there are 26 red cards (of diamonds and hearts) and 26 black cards (of
spades and clubs) in a pack of cards,
26
C2  26C2
Required probability  52
C4
(v) Since, in a pack of cards there are 13 cards of each suit,
C1  13C1  13C1  13C1
13
Required probability  52
C4
C2  13C2
13
(vi) Required probability  52
C4

Example 3: What is the chance that a non-leap year should have fifty-three
Sundays?

Solution: A non-leap year consists of 365 days: 52 full weeks and one over-day.
A non-leap year will consist of 53 Sundays if this over-day is a Sunday. This
over-day can be any one of the possible outcomes:

(i) Sunday (ii) Monday (iii) Tuesday (iv) Wednesday (v) Thursday (vi) Friday (vii)
Saturday, i.e., 7 outcomes in all. Of these, the number of ways favourable to the
required event, viz., the over-day being Sunday, is 1.

Required probability = 1/7


Example 4: Find the probability that in 5 tossings, a perfect coin turns up head at
least 3 times in succession.

Solution: In 5 tossings of a coin, the exhaustive number of cases is 2⁵ = 32.

The favourable cases for getting at least three heads in succession are:

Starting with the 1st toss: HHHTT, HHHTH, HHHHT, HHHHH (4 cases)

Starting with the 2nd toss: THHHT, THHHH (2 cases)

Starting with the 3rd toss: TTHHH, HTHHH (2 cases)

Hence, the total number of favourable cases for getting at least 3 heads in
succession is 8.

Required probability = 8/32 = 1/4
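This count is small enough to check exhaustively: of the 2⁵ = 32 sequences, exactly 8
contain a run of at least three heads.

from itertools import product

seqs = ["".join(t) for t in product("HT", repeat=5)]
favourable = [s for s in seqs if "HHH" in s]           # at least 3 heads in a row
print(len(favourable), len(favourable) / len(seqs))    # 8 0.25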

Example 5: A bag contains 20 tickets marked with numbers 1 to 20. One ticket is
drawn at random. Find the probability that it will be a multiple of (i)2 or 5, (ii)3 or 5

Solution: One ticket can be drawn out of 20 tickets in 20C1 = 20 ways, which
determines the exhaustive number of cases.

(i) The numbers of cases favourable to getting a ticket number which is:
(a) a multiple of 2: 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, i.e., 10 cases.
(b) a multiple of 5: 5, 10, 15, 20, i.e., 4 cases.

Of these, two cases, viz., 10 and 20, are counted twice.

Hence the number of distinct cases favourable to getting a number which is a
multiple of 2 or 5 is 10 + 4 − 2 = 12.

Required probability = 12/20 = 3/5

(ii) The cases favourable to getting a multiple of 3 are 3, 6, 9, 12, 15, 18, i.e., 6
cases in all, and to getting a multiple of 5 are 5, 10, 15, 20, i.e., 4 cases in all. Of
these, one case, viz., 15, is counted twice.

Hence, the number of distinct cases favourable to getting a multiple of 3 or 5 is
6 + 4 − 1 = 9.

Required probability = 9/20

Example 6: An urn contains 8 white and 3 red balls. If two balls are drawn at
random, find the probability that

(i) both are white, (ii) both are red, (iii) one is of each color.

Solution: The total number of balls in the urn is 8 + 3 = 11. Since 2 balls can be
drawn out of 11 balls in 11C2 ways,

Exhaustive number of cases = 11C2 = (11 × 10)/2 = 55

(i) If both the drawn balls are white, they must be selected out of the 8 white
balls, and this can be done in 8C2 = (8 × 7)/2 = 28 ways.
Probability that both the balls are white = 28/55
(ii) If both the drawn balls are red, they must be drawn out of the 3 red balls, and
this can be done in 3C2 = 3 ways. Hence, the probability that both the drawn
balls are red = 3/55.
(iii) The number of favourable cases for drawing one white ball and one red ball is
8C1 × 3C1 = 8 × 3 = 24

Probability that one ball is white and the other is red = 24/55

Example 7: The letters of the word 'article' are arranged at random. Find the
probability that the vowels occupy the even places.

Solution: The word 'article' contains 7 distinct letters which can be arranged
among themselves in 7! ways. Hence the exhaustive number of cases is 7!.

In the word 'article' there are 3 vowels, viz., a, i and e, and these are to be placed
in three even places, viz., the 2nd, 4th and 6th places. This can be done in 3! ways. For
each such arrangement, the remaining 4 consonants can be arranged in 4! ways.
Hence, associating these two operations, the number of favourable cases for the
vowels to occupy even places is 3! × 4!.

Required probability = (3! × 4!)/7! = 144/5040 = 1/35

Example 8: Twenty books are placed at random in a shelf. Find the probability
that a particular pair of books shall be:

(i) Always together (ii) Never together

Solution: Since 20 books can be arranged among themselves in 20! ways, the
exhaustive number of cases is 20!.

(i) Let us regard the two particular books as tagged together so that
we treat them as a single book. Thus we now have 19
books, which can be arranged among themselves in 19! ways. But the two
books which are fastened together can be arranged among themselves in
2! ways.

Hence, associating these two operations, the number of favourable cases for
getting the particular pair of books always together is 19! × 2!.

Required probability = (19! × 2!)/20! = 2/20 = 1/10
(ii) The total number of arrangements of 20 books among themselves is 20!, and
the total number of arrangements in which the particular pair of books is always
together is 19! × 2! [see part (i)]. Hence, the number of arrangements in which
the particular pair of books is never together is 20! − 19! × 2!.

Required probability = (20! − 19! × 2!)/20! = 1 − 2/20 = 9/10

Aliter: P[the particular pair of books is never together]

= 1 − P[the particular pair of books is always together] = 1 − 1/10 = 9/10.

Example 9: n persons are seated on n chairs at a round table. Find the
probability that two specified persons are sitting next to each other.

Solution: The n persons can be seated in n chairs at a round table in (n − 1)!
ways, which gives the exhaustive number of cases.

If the two specified persons, say A and B, sit together, then regarding A and B as
fixed together, we get n − 1 persons in all, who can be seated at a round table in
(n − 2)! ways. Further, since A and B can interchange their positions in 2! ways, the
total number of favourable cases for getting A and B together is (n − 2)! × 2!.
Hence, the required probability is:

P = ((n − 2)! × 2!)/(n − 1)! = 2/(n − 1)

Aliter: Let us suppose that of the n persons, two persons, say A and B, are to be
seated together at a round table. After one of these two persons, say A, occupies
a chair, the other person B can occupy any one of the remaining n − 1 chairs.
Out of these n − 1 seats, the number of seats favourable to making B sit next
to A is 2 (since B can sit on either side of A). Hence the required probability is 2/(n − 1).

Example 10: In a village of 21 inhabitants, a person tells a rumour to a second
person, who in turn repeats it to a third person, etc. At each step the recipient of
the rumour is chosen at random from the 20 people available. Find the
probability that the rumour will be told 10 times without:

(i) returning to the originator; (ii) being repeated to any person.

Solution: Since any person can tell the rumour to any one of the remaining
20 people in 20 ways, the exhaustive number of cases for the rumour to
be told 10 times is 20¹⁰.

(i) Let us define the event A:

A: The rumour will be told 10 times without returning to the originator.

The originator can tell the rumour to any one of the remaining 20 persons in 20
ways, and each of the 9 subsequent recipients of the rumour can tell it to any of
the 19 remaining persons (without returning it to the originator) in 19
ways. Hence the number of favourable cases for A is 20 × 19⁹. The required
probability is given by:

P(A) = (20 × 19⁹)/20¹⁰ = (19/20)⁹

(ii) Let us define the event B:

B: The rumour is told 10 times without being repeated to any person.

In this case the first person (narrator) can tell the rumour to any one of the
20 available persons; the second person can tell the rumour to any one
of the remaining 19 persons; the third person can tell the rumour to
any one of the remaining 18 persons; …; the 10th person can tell the
rumour to any one of the remaining 11 persons.

Hence the number of favourable cases for B is 20 × 19 × 18 × ... × 11.

Required probability = P(B) = (20 × 19 × 18 × ... × 11)/20¹⁰

Example 11: If 10 men, among whom are A and B, stand in a row, what is the
probability that there will be exactly 3 men between A and B?

Solution: If 10 men stand in a row, then A can occupy any one of the 10 positions
and B can occupy any one of the remaining 9 positions. Hence, the exhaustive
number of cases for the positions of the two men A and B is 10 × 9 = 90.

The cases favourable to the event that there are exactly 3 men between A and B
are given below:

(i) A is in the 1st position and B is in the 5th position.
(ii) A is in the 2nd position and B is in the 6th position.
…
(vi) A is in the 6th position and B is in the 10th position.

Further, since A and B can interchange their positions, the total number of
favourable cases is 6 × 2 = 12.

Required probability = 12/90 = 2/15

Example 12: A five-digit number is formed with the digits 0, 1, 2, 3, 4 (without
repetition). Find the probability that the number formed is divisible by 4.

Solution: The total number of ways in which the five digits 0, 1, 2, 3, 4 can be
arranged among themselves is 5!. Out of these, the number of arrangements
which begin with 0 (and therefore give only 4-digit numbers) is 4!.

Hence the total number of five-digit numbers that can be formed from the digits 0,
1, 2, 3, 4 is 5! − 4! = 120 − 24 = 96.

The number formed will be divisible by 4 if the number formed by the two digits
on the extreme right (i.e., the digits in the units and tens places) is divisible by 4.
Such endings are:

04, 12, 20, 24, 32 and 40

If the numbers end in 04, 20 or 40, the remaining three digits can be arranged
among themselves in 3! = 6 ways in each case.

If the numbers end with 12, the remaining three digits 0, 3, 4 can be arranged in
3! ways. Out of these, we must reject those numbers which start with 0 (i.e., have
0 as the first digit). There are 2! = 2 such cases. Hence, the number of five-digit
numbers ending with 12 is 3! − 2! = 4.

Similarly, the number of 5-digit numbers ending with 24 and 32 is 4 each. Hence
the total number of favourable cases is 3 × 6 + 3 × 4 = 30.

Hence, required probability = 30/96 = 5/16

Example13: There are four hotels in a certain town. If 3 men check into hotels in
a day, what is the probability that each checks into a different hotel?

Solution: Since each man can check into any one of the four hotels in 4C1 = 4 ways,
the 3 men can check into the 4 hotels in 4 × 4 × 4 = 64 ways, which gives the
exhaustive number of cases.

If the three men are to check into different hotels, then the first man can check into
any one of the 4 hotels in 4C1 = 4 ways; the second man can check into any one of
the remaining 3 hotels in 3C1 = 3 ways; and the third man can check into any one of
the remaining two hotels in 2C1 = 2 ways. Hence, the number of favourable cases
for each man checking into a different hotel is 4C1 × 3C1 × 2C1 = 4 × 3 × 2 = 24.

Required probability = 24/64 = 3/8

1.4
Theorems in Probability
In this module, we shall prove some theorems which help us to evaluate the
probabilities of some complicated events in a rather simple way. In proving these
theorems, we shall follow the axiomatic approach based on the three axioms given
in axiomatic definition of probability in module 1.3 on definitions of probability.

In a problem on probability, we are required to evaluate the probability of certain
statements. These statements can be expressed in set notation, and their
probabilities can then be evaluated using theorems in probability. Let A and B be
two events in S. Certain statements in set notation are given in the following table.

S. No.  Statement                                      Set notation

1.      At least one of the events A or B occurs       A ∪ B
2.      Both the events A and B occur                  A ∩ B
3.      Neither A nor B occurs                         A' ∩ B' = (A ∪ B)'
4.      Event A occurs and B does not occur            A ∩ B'
5.      Exactly one of the events A or B occurs        (A ∩ B') ∪ (A' ∩ B)
6.      Not more than one of the events A or B occurs  (A ∩ B') ∪ (A' ∩ B) ∪ (A' ∩ B') = (A ∩ B)'
7.      If event A occurs, so does B                   A ⊆ B
8.      Events A and B are mutually exclusive          A ∩ B = φ
9.      Complement of event A                          A'
10.     Sample space                                   S
Example 1: Let A, B and C be three events in S. Find expressions in set notation for
the events:

(i) only A occurs (ii) both A and B, but not C, occur
(iii) all three events occur (iv) at least one occurs
(v) at least two occur (vi) one and no more occurs
(vii) two and no more occur (viii) none occurs

Solution:

(i) A ∩ B' ∩ C' (ii) A ∩ B ∩ C'
(iii) A ∩ B ∩ C (iv) A ∪ B ∪ C
(v) (A ∩ B) ∪ (B ∩ C) ∪ (C ∩ A)

(vi) (A ∩ B' ∩ C') ∪ (A' ∩ B ∩ C') ∪ (A' ∩ B' ∩ C)

(vii) (A ∩ B ∩ C') ∪ (A ∩ B' ∩ C) ∪ (A' ∩ B ∩ C)

(viii) A' ∩ B' ∩ C'

Theorems on Probability

Theorem 1: The probability of the impossible event is zero, i.e., P(φ) = 0.

Proof: We know that S ∪ φ = S and S ∩ φ = φ.

P(S) = P(S ∪ φ) = P(S) + P(φ) (Axiom 3) ⟹ P(φ) = 0

Theorem 2: The probability of the complementary event A' of A is given by
P(A') = 1 − P(A).

Proof: Since A and A' are mutually exclusive events in S and A ∪ A' = S,

1 = P(S) = P(A ∪ A') = P(A) + P(A') (Axioms 2 and 3)

⟹ P(A') = 1 − P(A)

Corollary 1: 0 ≤ P(A) ≤ 1

Proof: We have P(A) = 1 − P(A') ≤ 1 (since P(A') ≥ 0, by Axiom 1).

Further, P(A) ≥ 0 (by Axiom 1). Therefore, 0 ≤ P(A) ≤ 1.

Corollary 2: P(φ) = 0

Proof: Since P(φ) = P(S') = 1 − P(S) = 1 − 1 = 0 (by Axiom 2).

Theorem 3: For any two events A and B, we have

(i) P(A ∩ B') = P(A) − P(A ∩ B) (ii) P(A' ∩ B) = P(B) − P(A ∩ B)

Proof:

(i) From the Venn diagram, we have

A = (A ∩ B) ∪ (A ∩ B'),
where A ∩ B and A ∩ B' are mutually exclusive events. Hence by
Axiom 3,
P(A) = P(A ∩ B) + P(A ∩ B')
⟹ P(A ∩ B') = P(A) − P(A ∩ B)

(ii) Similarly, we have

B = (A ∩ B) ∪ (A' ∩ B),
where A ∩ B and A' ∩ B are mutually exclusive events. Hence by
Axiom 3,
P(B) = P(A ∩ B) + P(A' ∩ B)
⟹ P(A' ∩ B) = P(B) − P(A ∩ B)

Theorem 4: If B ⊆ A, then

(i) P(A ∩ B') = P(A) − P(B) (ii) P(B) ≤ P(A)

Proof:

(i) If B ⊆ A, then B and A ∩ B' are mutually exclusive events and
A = B ∪ (A ∩ B').

P(A) = P(B) + P(A ∩ B') (Axiom 3)

⟹ P(A ∩ B') = P(A) − P(B)

(ii) We have P(A ∩ B') ≥ 0 (Axiom 1). Hence P(A) − P(B) ≥ 0, i.e.,
P(B) ≤ P(A).
Thus, B ⊆ A ⟹ P(B) ≤ P(A).

Theorem 5: Addition Theorem of Probability for Two Events:

Let A and B be any two events in S. Then

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Proof: From the Venn diagram, we have

A ∪ B = A ∪ (A' ∩ B),

where A and A' ∩ B are mutually exclusive events in S.

P(A ∪ B) = P(A) + P(A' ∩ B) (Axiom 3)

= P(A) + P(B) − P(A ∩ B) (From Theorem 3)

Thus, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Note:

1. If A and B are mutually exclusive events, then A ∩ B = φ and hence
P(A ∩ B) = 0. Thus, if A and B are mutually exclusive events, then
P(A ∪ B) = P(A) + P(B).
2. The addition theorem of probability for three events is given by

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)

This can be proved by first taking A ∪ B as one event and C as the second event
and repeatedly applying Theorem 5:

P(A ∪ B ∪ C) = P(A ∪ B) + P(C) − P((A ∪ B) ∩ C)

= P(A) + P(B) − P(A ∩ B) + P(C) − P((A ∩ C) ∪ (B ∩ C))

= P(A) + P(B) + P(C) − P(A ∩ B) − [P(A ∩ C) + P(B ∩ C) − P(A ∩ B ∩ C)]

= P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)

3. Addition Theorem of Probability for n Events

Let A₁, A₂, ..., Aₙ be events in S. Then

P(⋃ᵢ₌₁ⁿ Aᵢ) = Σᵢ P(Aᵢ) − Σᵢ<ⱼ P(Aᵢ ∩ Aⱼ) + Σᵢ<ⱼ<ₖ P(Aᵢ ∩ Aⱼ ∩ Aₖ) − ... + (−1)ⁿ⁻¹ P(⋂ᵢ₌₁ⁿ Aᵢ)
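The three-event form can be checked numerically on any concrete sample space. The
sketch below uses one throw of a die with illustrative events (an assumption, not
from the text): A = even, B = multiple of 3, C = greater than 4.

from fractions import Fraction

S = set(range(1, 7))
A = {2, 4, 6}; B = {3, 6}; C = {5, 6}
P = lambda E: Fraction(len(E), len(S))     # classical probability on S

lhs = P(A | B | C)
rhs = P(A) + P(B) + P(C) - P(A & B) - P(B & C) - P(C & A) + P(A & B & C)
print(lhs, rhs)                            # both 5/6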

Example 2: If two dice are thrown, what is the probability that the sum is
(i) greater than 8, (ii) neither 7 nor 11, (iii) an even number on the first die or a
total of 8?

Solution:

(i) If two dice are thrown, then n(S) = 36. Let E be the event of getting a
sum greater than 8 on the two dice. Then
E = E₉ ∪ E₁₀ ∪ E₁₁ ∪ E₁₂, where E₉, E₁₀, E₁₁ and E₁₂ are respectively the events
of getting sums of 9, 10, 11 and 12. Note that these are pairwise mutually
exclusive events. Therefore
P(E) = P(E₉) + P(E₁₀) + P(E₁₁) + P(E₁₂)
Note that E₉ = {(3,6), (4,5), (5,4), (6,3)} and P(E₉) = 4/36;
E₁₀ = {(4,6), (5,5), (6,4)} and P(E₁₀) = 3/36;
E₁₁ = {(5,6), (6,5)} and P(E₁₁) = 2/36;
E₁₂ = {(6,6)} and P(E₁₂) = 1/36.
P(E) = (4 + 3 + 2 + 1)/36 = 10/36 = 5/18

(ii) Let A denote the event of getting a sum of 7 and B denote the event of
getting a sum of 11. Then
A = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)} and P(A) = 6/36,
B = {(5,6), (6,5)} and P(B) = 2/36.
Required probability = P(neither 7 nor 11)
= P(A' ∩ B') = P((A ∪ B)') = 1 − P(A ∪ B)
= 1 − [P(A) + P(B)] (A and B are mutually exclusive events)
= 1 − 8/36 = 28/36 = 7/9

(iii) Let A be the event of getting an even number on the first die and B be the
event of getting a sum of 8. Therefore,

A = {2, 4, 6} × {1, 2, 3, 4, 5, 6} and P(A) = 18/36,
B = {(2,6), (3,5), (4,4), (5,3), (6,2)} and P(B) = 5/36,
A ∩ B = {(2,6), (4,4), (6,2)} and P(A ∩ B) = 3/36.
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = (18 + 5 − 3)/36 = 20/36 = 5/9

Example 3: A card is drawn from a pack of 52 cards. Find the probability of
getting a king or a heart or a red card.

Solution: Let us define the following events:

A: The card drawn is a king

B: The card drawn is a heart

C: The card drawn is a red card

Then A, B and C are not mutually exclusive, with

P(A) = 4/52, P(B) = 13/52, P(C) = 26/52,
P(A ∩ B) = 1/52, P(B ∩ C) = 13/52, P(C ∩ A) = 2/52, P(A ∩ B ∩ C) = 1/52.

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)
= (4 + 13 + 26 − 1 − 13 − 2 + 1)/52 = 28/52 = 7/13

Compound event: The simultaneous occurrence of two or more events is termed
a compound event.

Compound probability: The probability of a compound event is known as a
compound probability.

Conditional probability: The probability of an event occuring when it is known


that some event has occurred, is called a conditional probability of the event ,
given that has occurred and denoted by .

Definition: The conditional probability of the event , given that has occurred,
denoted by ( , is defined by

55
Probability Theorems in Probability

( if (

If ( ,( is not defined.

Example 4: Consider a family with two children. Assume that each child is as likely
to be a boy as it is to be a girl. What is the conditional probability that both
children are boys, given that (i) the older child is a boy (ii) at least one of the
children is a boy?

Solution: We have the sample space S = {BB, BG, GB, GG}, where the first letter
refers to the older child. Define the events:

A = Older child is a boy = {BB, BG}

B = Younger child is a boy = {BB, GB}

Therefore, P(A) = P(B) = 1/2.

Then A ∩ B = both are boys = {BB}, and P(A ∩ B) = 1/4.

A ∪ B = At least one is a boy,

and P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 1/2 + 1/2 − 1/4 = 3/4.

(i) P(A ∩ B | A) = P(A ∩ B)/P(A) = (1/4)/(1/2) = 1/2,

(ii) P(A ∩ B | A ∪ B) = P(A ∩ B)/P(A ∪ B) = (1/4)/(3/4) = 1/3.
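A conditional probability like the ones above amounts to restricting the equally likely sample space to the conditioning event. The short Python sketch below (added for illustration) reproduces both answers:

```python
from fractions import Fraction

# Condition on an event by restricting the equally likely sample space.
space = ["BB", "BG", "GB", "GG"]          # first letter = older child

def cond_prob(event, given):
    restricted = [w for w in space if given(w)]
    return Fraction(sum(1 for w in restricted if event(w)), len(restricted))

both_boys = lambda w: w == "BB"
print(cond_prob(both_boys, lambda w: w[0] == "B"))   # 1/2
print(cond_prob(both_boys, lambda w: "B" in w))      # 1/3
```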

Independent events: Two events A and B are said to be independent if the
happening or non-happening of A is not affected by the happening or non-
happening of B. Thus, A and B are independent if and only if the conditional
probability of the event A given that B has happened is equal to the probability of
A. That is,

P(A|B) = P(A) if P(B) > 0.

Similarly, P(B|A) = P(B) if P(A) > 0.

By the definition of conditional probability, we have

P(A|B) = P(A ∩ B)/P(B) ⟹ P(A ∩ B) = P(B) P(A|B).

Thus, A and B are independent events if and only if

P(A ∩ B) = P(A) P(B).

In general, A₁, A₂, …, Aₙ are independent events if and only if

P(A₁ ∩ A₂ ∩ … ∩ Aₙ) = P(A₁) P(A₂) … P(Aₙ).

Pairwise Independent Events: A set of events A₁, A₂, …, Aₙ are said to be pairwise
independent if every pair of different events is independent.

That is, P(Aᵢ ∩ Aⱼ) = P(Aᵢ) P(Aⱼ) for all i and j, i ≠ j.

Mutually Independent Events: A set of events A₁, A₂, …, Aₙ are said to be mutually
independent if P(Aᵢ₁ ∩ Aᵢ₂ ∩ … ∩ Aᵢₖ) = P(Aᵢ₁) P(Aᵢ₂) … P(Aᵢₖ) for every
subset {i₁, i₂, …, iₖ} of {1, 2, …, n}.

Note: Pairwise independence does not imply mutual independence.

Theorem 6: Multiplication Theorem for Two Events

Let A and B be any two events. Then

P(A ∩ B) = P(A)·P(B|A), if P(A) > 0;

P(A ∩ B) = P(B)·P(A|B), if P(B) > 0;

P(A ∩ B) = P(A)·P(B), if A and B are independent.

The proof follows from the definition of conditional probability.

Note: Multiplication Theorem for n Events

P(A₁ ∩ A₂ ∩ … ∩ Aₙ) = P(A₁)·P(A₂|A₁)·P(A₃|A₁ ∩ A₂) … P(Aₙ|A₁ ∩ A₂ ∩ … ∩ Aₙ₋₁),

and P(A₁ ∩ A₂ ∩ … ∩ Aₙ) = P(A₁)·P(A₂) … P(Aₙ), if A₁, A₂, …, Aₙ are independent.

Theorem 7: If A and B are independent events, then A and B̄ are also
independent.

Proof: (See P3)

Theorem 8: If A and B are independent events, then Ā and B̄ are also
independent.

Proof: (See P4)

Example 5: A fair die is thrown twice. Let A, B and C denote the following
events:

A = First toss is odd; B = Second toss is even; C = Sum of numbers is 7.

(i) Find P(A), P(B) and P(C).
(ii) Show that A, B and C are pairwise independent.
(iii) Show that A, B and C are not mutually independent.

Solution:

(i) The number of outcomes in the sample space is given by n(S) = 6 × 6 = 36.
We have

A = {(x, y) : x = 1, 3, 5} and n(A) = 18,

B = {(x, y) : y = 2, 4, 6} and n(B) = 18,

C = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)} and n(C) = 6.

Therefore, P(A) = P(B) = 1/2 and P(C) = 1/6.

(ii) A ∩ B = {(x, y) : x odd, y even}

and P(A ∩ B) = 9/36 = 1/4.

But P(A) P(B) = (1/2)(1/2) = 1/4.

Thus, P(A ∩ B) = P(A) P(B) ⟹ A and B are independent.

Next consider A ∩ C = {(1,6), (3,4), (5,2)},

and P(A ∩ C) = 3/36 = 1/12.

But P(A) P(C) = (1/2)(1/6) = 1/12.

Thus, P(A ∩ C) = P(A) P(C) ⟹ A and C are independent.

Similarly, B ∩ C = {(1,6), (3,4), (5,2)} and P(B ∩ C) = 1/12 = P(B) P(C), so B and C
are independent. Hence A, B and C are pairwise independent.

(iii) Consider A ∩ B ∩ C = {(1,6), (3,4), (5,2)},

and P(A ∩ B ∩ C) = 3/36 = 1/12.

But P(A) P(B) P(C) = (1/2)(1/2)(1/6) = 1/24.

Thus, P(A ∩ B ∩ C) ≠ P(A) P(B) P(C)

⟹ A, B and C are not mutually independent.
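The distinction between pairwise and mutual independence can be checked mechanically. The following Python sketch (an addition, not from the notes) does so for the events A, B, C of Example 5:

```python
from fractions import Fraction
from itertools import product

# Check pairwise vs. mutual independence by enumeration.
space = list(product(range(1, 7), repeat=2))
P = lambda ev: Fraction(sum(1 for w in space if ev(w)), len(space))

A = lambda w: w[0] % 2 == 1          # first toss odd
B = lambda w: w[1] % 2 == 0          # second toss even
C = lambda w: w[0] + w[1] == 7       # sum is 7

pairs_ok = all(
    P(lambda w, e=e, f=f: e(w) and f(w)) == P(e) * P(f)
    for e, f in [(A, B), (A, C), (B, C)]
)
triple_ok = P(lambda w: A(w) and B(w) and C(w)) == P(A) * P(B) * P(C)
print(pairs_ok, triple_ok)   # True False
```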

Theorem 9: If A and B are independent events, then

P(A ∪ B) = 1 − P(Ā) P(B̄).

Proof: Consider

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

= P(A) + P(B) − P(A) P(B)   (A and B are independent)

= P(A) + P(B)[1 − P(A)]

= 1 − [1 − P(A)] + P(B)[1 − P(A)]

= 1 − [1 − P(A)][1 − P(B)]

= 1 − P(Ā) P(B̄).

Thus, P(A ∪ B) = 1 − P(Ā) P(B̄).

Generalization: If A₁, A₂, …, Aₙ are independent events, then

P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = 1 − P(Ā₁) P(Ā₂) … P(Āₙ).

Example 6: A problem in probability is given to three students A, B and C, whose
chances of solving it are as given. Find the probability that the
problem will be solved if they all try independently.

Solution: Let A, B and C denote the events that the problem is solved by
A, B and C respectively, with the given probabilities P(A), P(B) and P(C). Then we have

P(Ā) = 1 − P(A), P(B̄) = 1 − P(B), P(C̄) = 1 − P(C).

The problem is solved if at least one of them is able to solve it.

Thus, P(problem is solved) = P(A ∪ B ∪ C) = 1 − P(Ā) P(B̄) P(C̄)   (by Theorem 9).


1.4. Theorems in Probability


Exercises:
1. A card is drawn from a well shuffled pack of 52 cards. Find the probability that
it is either a diamond or a king.
2. If ( ) = 0.4 , ( ) = 0. and (at least one of and )= 0.7, find (only
one of and ).
3. Let and be the two possible outcomes of an experiment and suppose
( ) = 0.4, ( ) = 0. and ( ) = .
(i) For what choice of are and mutually exclusive?
(ii) For what choice of are and independent?
4. An urn contains four tickets marked with numbers 112, 121, 211 and 222 and
one ticket is drawn at random. Let Aᵢ (i = 1, 2, 3) be the event that the i-th digit of
the number of the ticket drawn is 1. Are A₁, A₂, A₃ (i) pairwise independent
(ii) mutually independent?
5. An engineer applies for a job in two firms and . He estimates that the
probability of his being selected in firm is 0.7 and being rejected at is 0.5
and the probability of at least one of his applications being rejected is 0.6.
What is the probability that he will be selected in one of the firms?
6. Probability that a man will be alive 25 years hence is 0.3 and the probability
that his wife will be alive 25 years hence is 0.4. Find the probability that 25
years hence
(i) both will be alive
(ii) only the man will be alive
(iii) only the woman will be alive
(iv) none will be alive
(v) at least one of them will be alive
7. The probability that a contractor will get a plumbing contract is and the
probability that he will not get an electric contract is . If the probability of

getting at least one contract is , what is the probability that he will get both
the contracts?

8. A problem in probability is given to two students and . The odds in favour
of solving the problem are 6 to 9 and against solving the problem are 12 to
10. If both and attempt, find the probability of the problem being solved.
9. A piece of equipment will function only when all the three components ,
and are working. The probability of failing during one year is 0.15, that of
failing is 0.05 and that of failing is 0.10. What is the probability that the
equipment will fail before the end of the year?
10. Find the probability of throwing 6 at least once in six throws with a single die.
11. The odds that speaks the truth are 3 : 2 and the odds that speaks the truth
are 5 : 3. In what percentage of cases are they likely to contradict each other
on an identical point?
12. Three groups of children contain respectively 3 girls and 1 boy; 2 girls and 2
boys; 1 girl and 3 boys. One child is selected at random from each group. Find
the probability that the selected consist of 1 girl and 2 boys.
13. If ( ) = , ( ) = and ( ) = , then find
(i) ( )
(ii) ( )
14. If , and are mutually exclusive and exhaustive event such that ( ) =
( ) and ( ) = ( ), find ( ), ( ) and ( ).
15. If ( ) = 0. , ( ) = 0. and ( ) = 0. and , , are independent
events, find the probability of occurrence of atleast one of the three events
, and .

Answers:

1. 4/13
2. 0.
3. (i) = 0. (ii) = 0.
4. (i) yes (ii) no
5. 0.7
6. (i) 0. (ii) 0. 7 (iii) 0. 7 (iv) 0.4 (v) 0. 7
7.


8. 37/55
9. 1 − (0.85)(0.95)(0.90) = 0.27325
10. 1 − (5/6)⁶ ≈ 0.665

11. 19/40, i.e., 47.5%
12. 13/32
13. (i) 0. (ii) 0.7
14. , ,
15. 0.4 1


1.5
Bayes' Theorem and Its Applications
One of the important applications of conditional probability is in the
computation of unknown probabilities on the basis of the information supplied by
the experiment or past records. For example, suppose an event has occurred
through one of several mutually exclusive events or reasons. Then the
conditional probability that it has occurred due to a particular event or reason is
called its inverse or posterior probability. These probabilities are computed by
Bayes' theorem, named after the British mathematician Thomas Bayes who
propounded it. The revision of old (given) probabilities in the light of the
additional information supplied by the experiment or past records is of extreme
help in arriving at valid decisions in the face of uncertainty.

Bayes' Theorem (Rule for the Inverse Probability)

Let E₁, E₂, …, Eₙ be mutually exclusive and exhaustive events in the sample
space S with P(Eᵢ) > 0 for i = 1, 2, …, n. Let A be an arbitrary event which is a
subset of ⋃ᵢ₌₁ⁿ Eᵢ such that P(A) > 0. Then

P(Eᵢ | A) = P(Eᵢ) P(A | Eᵢ)/P(A) = P(Eᵢ) P(A | Eᵢ) / [Σⱼ₌₁ⁿ P(Eⱼ) P(A | Eⱼ)], for i = 1, 2, …, n.

Proof: Since A ⊂ ⋃ᵢ₌₁ⁿ Eᵢ, we have A = A ∩ (⋃ᵢ₌₁ⁿ Eᵢ) = ⋃ᵢ₌₁ⁿ (A ∩ Eᵢ).

Since A ∩ Eᵢ (i = 1, 2, …, n) are mutually exclusive events, we have by the
addition theorem of probability

P(A) = P(⋃ᵢ₌₁ⁿ (A ∩ Eᵢ)) = Σᵢ₌₁ⁿ P(A ∩ Eᵢ)

⟹ P(A) = Σᵢ₌₁ⁿ P(Eᵢ) P(A | Eᵢ)   (by multiplication theorem of probability).

Also we have P(A ∩ Eᵢ) = P(A) P(Eᵢ | A), so that

P(Eᵢ | A) = P(A ∩ Eᵢ)/P(A) = P(Eᵢ) P(A | Eᵢ) / [Σⱼ₌₁ⁿ P(Eⱼ) P(A | Eⱼ)], for i = 1, 2, …, n,

which is the Bayes' rule.

Note:

1. The probabilities P(E₁), P(E₂), …, P(Eₙ) are known as the 'a priori
probabilities', because they exist before we gain any information from the
experiment itself.
2. The probabilities P(A | Eᵢ), i = 1, 2, …, n, are called 'likelihoods' because they
indicate how likely the event A under consideration is to occur, given each and
every a priori probability.
3. The probabilities P(Eᵢ | A), i = 1, 2, …, n, are called 'posterior probabilities'
because they are determined after the results of the experiment are known.
4. P(A) = Σᵢ₌₁ⁿ P(Eᵢ) P(A | Eᵢ) is known as the total probability.
5. Bayes' theorem is extensively used by business, management and engineering
executives in arriving at valid decisions in the face of uncertainty.

Example 1: In a bolt factory, machines E₁, E₂ and E₃ manufacture respectively
25%, 35% and 40% of the total. Of their output 5, 4 and 2 percent are known to
be defective bolts. A bolt is drawn at random from the product and is found to
be defective. What are the probabilities that it was manufactured by

(i) Machine E₂?
(ii) Machine E₁ or E₃?

Solution: Let E₁, E₂ and E₃ denote respectively the events that the bolt selected
at random is manufactured by machines E₁, E₂ and E₃, and let A
denote the event that it is defective. Then we have:

Event:           E₁      E₂      E₃      Total
P(Eᵢ):           0.25    0.35    0.40    1.00
P(A | Eᵢ):       0.05    0.04    0.02
P(Eᵢ) P(A | Eᵢ): 0.0125  0.0140  0.0080  P(A) = 0.0345

(i) Hence, the probability that a defective bolt chosen at random is
manufactured by machine E₂ is given by Bayes' rule as:

P(E₂ | A) = P(E₂) P(A | E₂)/P(A) = 0.0140/0.0345 = 28/69.

(ii) Similarly,

P(E₁ | A) = 0.0125/0.0345 = 25/69 and P(E₃ | A) = 0.0080/0.0345 = 16/69.

Hence, the probability that a defective bolt chosen at random is manufactured by
machine E₁ or E₃ is:

P(E₁ | A) + P(E₃ | A) = 25/69 + 16/69 = 41/69

(OR Required probability is equal to 1 − P(E₂ | A) = 1 − 28/69 = 41/69.)
Aliter: TREE DIAGRAM

(A tree diagram can be drawn with a first set of branches for E₁, E₂, E₃ carrying
the prior probabilities, and a second set of branches carrying the conditional
probabilities of a defective bolt.) From the diagram, the probability that a defective
bolt is manufactured by machine E₂ is

P(E₂ | A) = (0.35 × 0.04)/(0.25 × 0.05 + 0.35 × 0.04 + 0.40 × 0.02) = 28/69.

Similarly, P(E₁ | A) = 25/69 and P(E₃ | A) = 16/69.

Hence, the probability that a defective bolt chosen at random is manufactured by
machine E₁ or E₃ is:

25/69 + 16/69 = 41/69 (OR Required probability is equal to 1 − 28/69 = 41/69).
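The posterior computation used in Example 1 is the same for any finite set of hypotheses. The sketch below (added; the priors and likelihoods are those stated in the example) computes all three posteriors exactly:

```python
from fractions import Fraction

# Generic Bayes posterior: joint = prior * likelihood, normalised by P(A).
priors      = [Fraction(25, 100), Fraction(35, 100), Fraction(40, 100)]
likelihoods = [Fraction(5, 100), Fraction(4, 100), Fraction(2, 100)]

def posteriors(priors, likelihoods):
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)                 # total probability P(A)
    return [j / total for j in joint]

print(posteriors(priors, likelihoods))   # [25/69, 28/69, 16/69]
```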

Remark: Since P(E₃) is greatest, on the basis of 'a priori' probabilities alone, we
are likely to conclude that a defective bolt drawn at random from the product is
manufactured by machine E₃. After using the additional information we obtained
the posterior probabilities, which give P(E₂ | A) as maximum. Thus, we shall now
say that it is most probable that the defective bolt has been manufactured by
machine E₂, a result which is different from the earlier conclusion. However, the latter
conclusion is a much more valid conclusion as it is based on the entire information at
our disposal. Thus, Bayes' rule provides a very powerful tool in improving the
quality of probability, and this helps the management executives in arriving at
valid decisions in the face of uncertainty. Thus, the additional information reduces
the importance of the prior probabilities. The only requirement for the use of
Bayes' rule is that all the hypotheses under consideration must be valid and
that none is assigned 'a priori' probability 0 or 1.

Example 2: In a railway reservation office, two clerks are engaged in checking
reservation forms. On an average, the first clerk checks a stated share of the forms,
while the second does the remaining. The first clerk has a stated error rate,
and the second has another. A reservation form is selected at random
from the total number of forms checked during a day, and is found to have an
error. Find the probability that it was checked (i) by the first (ii) by the second
clerk.

Solution: Let us define the following events:

E₁: The selected form is checked by clerk 1.

E₂: The selected form is checked by clerk 2.

A: The selected form has an error.

Then we are given P(E₁), P(E₂) and the error rates P(A | E₁), P(A | E₂).

We are required to find P(E₁ | A) and P(E₂ | A). By Bayes' rule, the probability that the
form containing the error was checked by clerk 1 is given by

P(E₁ | A) = P(E₁) P(A | E₁) / [P(E₁) P(A | E₁) + P(E₂) P(A | E₂)].

Similarly, the probability that the form containing the error was checked by clerk
2 is given by

P(E₂ | A) = P(E₂) P(A | E₂) / [P(E₁) P(A | E₁) + P(E₂) P(A | E₂)]

(OR P(E₂ | A) = 1 − P(E₁ | A)).

Example 3: The results of an investigation by an expert on a fire accident in a
skyscraper are summarized below:

(i) Prob. (there could have been short circuit) = as stated
(ii) Prob. (LPG cylinder explosion) = as stated
(iii) The chance of a fire accident is one value given a short circuit and another given an LPG
explosion.

Based on these, what do you think is the most probable cause of fire?

Solution: Let us define the following events:

E₁: Short circuit; E₂: LPG explosion; A: Fire accident.

Then we are given P(E₁), P(E₂) and the likelihoods P(A | E₁), P(A | E₂).

By Bayes' rule:

P(E₁ | A) = P(E₁) P(A | E₁) / [P(E₁) P(A | E₁) + P(E₂) P(A | E₂)]

(OR P(E₂ | A) = 1 − P(E₁ | A)).

Since P(E₁ | A) > P(E₂ | A), short circuit is the most probable cause of fire.


Example 4: The contents of urns I, II and III are respectively as follows:

1 white, 2 black and 3 red balls,

2 white, 1 black and 1 red ball, and

4 white, 5 black and 3 red balls.

One urn is chosen at random and two balls are drawn. They happen to be white and
red. What is the probability that they came from urn I?

Solution:

Let E₁, E₂ and E₃ denote the events of choosing the 1st, 2nd and 3rd urn respectively
and let A be the event that the two balls drawn from the selected urn are white
and red. Then we have:

P(E₁) = P(E₂) = P(E₃) = 1/3,

P(A | E₁) = (1 × 3)/⁶C₂ = 3/15 = 1/5,
P(A | E₂) = (2 × 1)/⁴C₂ = 2/6 = 1/3,
P(A | E₃) = (4 × 3)/¹²C₂ = 12/66 = 2/11.

Hence by Bayes' rule, the probability that the two white and red balls drawn are
from the 1st urn is:

P(E₁ | A) = P(E₁) P(A | E₁) / [Σᵢ P(Eᵢ) P(A | Eᵢ)]
= (1/5)/(1/5 + 1/3 + 2/11) = (33/165)/(118/165) = 33/118.

Similarly, we have

P(E₂ | A) = 55/118 and P(E₃ | A) = 30/118 (or 1 − 33/118 − 55/118 = 30/118).

1.5. Bayes‘ Theorem and Its Applications


Exercise
1. There will be three candidates for the position of principal in the
college, Dr. Singhal, Mr. Mehra and Dr. Chatterji, whose chances of getting the
appointment are in the proportion stated. The probability that
Dr. Singhal, if selected, will abolish co-education in the college is stated; the
probabilities of Mr. Mehra and Dr. Chatterji doing the same are likewise stated.
What is the probability that co-education will be abolished from the
college?

2. (a) Suppose that one of three men, a politician, a businessman, and an


educationist, will be appointed as the vice – chancellor of a university. The
respective probabilities of their appointments are . The
probability that research activities will be promoted by these people if they are
appointed are and respectively. What is the probability that
research will be promoted by the new vice – chancellor?
(b) A manufacturing firm purchases a certain component for its manufacturing
process from three sub-contractors A, B and C. These supply stated percentages
of the firm's requirements, and stated percentages of the respective suppliers'
items are defective. On a particular day, a normal shipment arrives from each of
the three suppliers and the contents get mixed. If a component is chosen at
random from the day’s shipment, what is the probability that it is defective?

3. Assume that a factory has two machines. Past records show that machine A
produces a stated share of the items of output and machine B produces the rest.
Further, stated percentages of the items produced by machines A and B
were defective. If a defective item is drawn at random, what is the probability
that it was produced by
(i) machine A, (ii) machine B?


4. In a bolt factory machines and manufacture respectively and


of the total of its output. Of them and percent respectively are
defective bolts. A bolt is drawn at random from the product and it found to be
defective. What is the probability that it was manufactured by machine ?

5. A factory produces a certain type of outputs by three types of machines. The


respective daily production figures are:
Machine : Units ; Machine : Units ; Machine : Units
Past experience shows that 1 percent of the output produced by Machine is
defective. The corresponding fraction of defectives for the other two machines
are percent and percent respectively. An item is drawn at random from
the day’s production run and is found to be defective, what is probability that
it comes from the output of
(a) Machine , (ii) Machine , (iii) Machine ?

6. Suppose that a product is produced in three factories and . It is known


that factory A produces twice as many items as factory B, and that factories
B and C produce the same number of products. Assume that it is known that a stated
percent of the items produced by each of the factories A and B are defective
while a stated percent of those manufactured by factory C are defective. All the
items produced in three factories are stocked, and an item of product is
selected at random. What is the probability that this item is defective?


7. A company has two plants to manufacture scooters. One plant manufactures


of the scooters and Plant manufactures . At Plant , of the
scooters produced are of standard quality and at Plant , of the scooters
produced are of standard quality. A scooter is picked at random and found to
be of standard quality. What is the chance that it has come from Plant ?

8. Suppose there is a chance for a newly constructed building to collapse,
whether the design is faulty or not. The chance that the design is faulty is
stated. The chance that the building collapses is one value if the design is faulty, and
another otherwise. It is seen that the building collapsed. What is the
probability that it is due to faulty design?

Answers:

1.
2. b.
3. , .
4.
5. a. b. c.
6. 0.07
7.
8.


UNIT-II: Random Variable


Unit – 2
Probability distributions
2.1
Random Variable
While performing a random experiment we are mainly concerned with the
assignment and computation of probabilities of events. In many experiments we
are interested in some function of the outcomes of the experiment as opposed to
the outcome itself. For instance, in tossing two dice we are interested in the sum
of faces of the dice and are not really concerned about the actual outcome. That
is, we may be interested in knowing that the sum is seven and not be concerned
over whether actual outcome was or or or or or
. These quantities of interest or more formally these real valued function
defined on the sample space are known as random variables.

Random variable (r.v.): Let S be the sample space associated with a random
experiment. Let R be the set of real numbers. If X : S → R is a real valued
function defined on the sample space, then X is known as a random variable. In
other words, a random variable is a function which takes real values which are
determined by the outcomes in the sample space.
The random variables are denoted by capital letters X, Y, Z, etc.

Notation: Let x ∈ R. The set of all ω in S such that X(ω) = x is denoted by
(X = x). That is, (X = x) denotes the event {ω ∈ S : X(ω) = x}. Similarly,
(X ≤ x) denotes the event {ω ∈ S : X(ω) ≤ x} and (a < X ≤ b) denotes the event
{ω ∈ S : a < X(ω) ≤ b}.


Let us consider a random experiment of three tosses of a coin. Then the sample
space consists of 8 points as given below:

S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}.

For each outcome ω in S define X(ω) as the number of heads in the outcome ω.
Then X may take any one of the values 0, 1, 2 or 3. For each outcome in S, we
have one value of X. Thus,

X(HHH) = 3, X(HHT) = X(HTH) = X(THH) = 2, X(HTT) = X(THT) = X(TTH) = 1 and X(TTT) = 0.

This shows that X is a random variable.

Note that (X = 2), (X ≤ 1) and (X ≥ 1) respectively denote the events

{HHT, HTH, THH}, {HTT, THT, TTH, TTT} and S − {TTT}.

Discrete Random Variable (d.r.v.): If the random variable assumes only a finite or
countably infinite set of values, it is known as a discrete random variable.
For example, the number of students attending the class, the number of
defectives in a lot of manufactured items and the number of accidents
taking place on a busy road, etc., are all discrete random variables. In the above
example X is a d.r.v.

Continuous Random Variable (c.r.v.): If a random variable can assume an
uncountable set of values, it is said to be a continuous random variable.
For example, the age, height or weight of the students in a class are all continuous
random variables. In case of a continuous random variable, we usually talk of the
value in a particular interval and not at a point. Generally, a discrete random
variable represents count data while a continuous random variable represents
measured data.

The probabilistic behavior of a d.r.v. X at each real point is described by a function
called the probability mass function, defined below.

Probability Mass Function (p.m.f.): Let X be a discrete random variable with
distinct values x₁, x₂, …. The function p defined as

p(xᵢ) = P(X = xᵢ), i = 1, 2, …,

is called the probability mass function of the r.v. X, if (i) p(xᵢ) ≥ 0 for all i, and

(ii) Σₓ p(x) = 1.

Probability Distribution: The set of all possible ordered pairs (x, p(x))
is called the probability distribution of the r.v. X.

In particular, if X takes the values x₁, x₂, …, xₙ, then the probability distribution of X is
usually represented in a tabular form as given below:

Probability distribution of the r.v. X:

X:       x₁      x₂      …   xₙ
P(X=x):  p(x₁)   p(x₂)   …   p(xₙ)

Note: The concept of probability distribution is analogous to that of frequency


distribution. Just as frequency distribution tells us how the total frequency is
distributed among different values (or classes) of the variable, similarly a
probability distribution tells us how total probability is distributed among the
various values which the r. v. can take.

Example 1: Obtain the probability distribution of X, the number of heads in
three tosses of a coin (or a simultaneous toss of three coins).

Solution:

The sample space consists of 8 sample points, as given below:

S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}.

Obviously, X is a random variable which can take the values 0, 1, 2 or 3.

The probability distribution of X is computed as given below:

No. of heads (x)   Favourable outcomes   No. of favourable cases   P(X = x)
0                  TTT                   1                         1/8
1                  HTT, THT, TTH         3                         3/8
2                  HHT, HTH, THH         3                         3/8
3                  HHH                   1                         1/8

Probability Density Function (p.d.f.): Let X be a continuous random variable
defined on the sample space S. Let f be a real valued function defined on R
such that, for any real numbers a and b with a < b,

P(a ≤ X ≤ b) = ∫ₐᵇ f(x) dx.

If the function f satisfies (i) f(x) ≥ 0 for all x, and (ii) ∫₋∞^∞ f(x) dx = 1, then
f is known as the probability density function (p.d.f.) of X.

Note:

1. If X is a c.r.v., then P(X = c) = 0, where c is some real number.

2. Unlike a discrete probability distribution, a continuous probability distribution
can't be presented in a tabular form.

Cumulative Distribution Function (c.d.f.): The cumulative distribution function
F of a r.v. X is defined by F(x) = P(X ≤ x), where

F(x) = Σ_{t ≤ x} p(t), if X is a d.r.v. with p.m.f. p(x),

F(x) = ∫₋∞ˣ f(t) dt, if X is a c.r.v. with p.d.f. f(x).

Note: If X is a continuous random variable, then f(x) = dF(x)/dx.

Properties of c.d.f.

1. 0 ≤ F(x) ≤ 1.
2. F(x) ≤ F(y) if x < y, i.e., F is non-decreasing.
3. F(−∞) = 0 and F(∞) = 1.
4. Discontinuities of F are at most countable.

Note: The c.d.f. is used to find the cumulative probabilities in a probability
distribution; for instance, P(a < X ≤ b) = F(b) − F(a).

Example 2:

(i) Find the constant k such that

f(x) = kx² for 0 < x < 3; 0 otherwise,

is a p.d.f.

(ii) Compute P(1 < X < 2).
(iii) Find the c.d.f. and use it to compute P(1 < X < 2).

Solution:

(i) f is a p.d.f. if

∫₋∞^∞ f(x) dx = 1

⟹ ∫₀³ kx² dx = k[x³/3]₀³ = 9k = 1 ⟹ k = 1/9.

(ii)

P(1 < X < 2) = ∫₁² f(x) dx = (1/9) ∫₁² x² dx = (1/9)[x³/3]₁² = 7/27.

(iii) We have

F(x) = P(X ≤ x) = ∫₋∞ˣ f(u) du.

If x ≤ 0, then F(x) = 0. If 0 < x < 3, then

F(x) = ∫₀ˣ (u²/9) du = x³/27.

If x ≥ 3, then

F(x) = ∫₀³ (u²/9) du + ∫₃ˣ 0 du = 1.

Thus, the required c.d.f. is

F(x) = 0 for x ≤ 0; x³/27 for 0 < x < 3; 1 for x ≥ 3.

Hence P(1 < X < 2) = F(2) − F(1) = 8/27 − 1/27 = 7/27.
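As an added illustration, the constant k, the probability in (ii) and the c.d.f. of Example 2 can be checked symbolically, assuming the SymPy library is available:

```python
import sympy as sp

# Symbolic check of Example 2 (added; assumes SymPy).
x, u, k = sp.symbols('x u k', positive=True)

# (i) choose k so that the total probability over (0, 3) equals 1
k_val = sp.solve(sp.integrate(k * x**2, (x, 0, 3)) - 1, k)[0]   # 1/9
pdf = k_val * x**2

# (ii) P(1 < X < 2) directly from the density
print(sp.integrate(pdf, (x, 1, 2)))          # 7/27

# (iii) the c.d.f. on 0 < x < 3, then the same probability from it
F = sp.integrate(pdf.subs(x, u), (u, 0, x))  # x**3/27
print(F, F.subs(x, 2) - F.subs(x, 1))        # x**3/27  7/27
```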


Example 3: A die is tossed twice. Getting an odd number is termed a success.
Find the probability distribution and c.d.f. of the number of successes.

Solution: The cases favourable to getting an odd number in a throw of a die
are 1, 3, 5, i.e., 3 in all.

Probability of success p = 3/6 = 1/2; probability of failure q = 1/2.

If X denotes the number of successes in two throws of a die, then X is a random
variable which takes the values 0, 1, 2.

P(X = 0) = P(failure in 1st throw and failure in 2nd throw) = (1/2)(1/2) = 1/4,

P(X = 1) = P(success in 1st and failure in 2nd) + P(failure in 1st and success in 2nd)
= (1/2)(1/2) + (1/2)(1/2) = 1/2,

P(X = 2) = P(success in 1st and success in 2nd) = (1/2)(1/2) = 1/4.

Hence the probability distribution of X is given by:

X:       0    1    2
P(X=x):  1/4  1/2  1/4

The c.d.f. is given by

F(x) = 0 if x < 0; 1/4 if 0 ≤ x < 1; 3/4 if 1 ≤ x < 2; 1 if x ≥ 2.

Example 4: Two cards are drawn

(a) successively with replacement

(b) simultaneously (successively without replacement),

from a well shuffled deck of 52 cards. Find the probability distribution of the
number of aces.

Solution: Let X denote the number of aces obtained in a draw of two cards.
Obviously, X is a random variable which can take the values 0, 1 or 2.

(a) Probability of drawing an ace is 4/52 = 1/13.

Probability of drawing a non-ace is 12/13.

Since the cards are drawn with replacement, all the draws are independent.

P(X = 2) = P(Ace and Ace) = P(Ace) P(Ace) = (1/13)(1/13) = 1/169,

P(X = 1) = P(Ace and Non-ace) + P(Non-ace and Ace)
= (1/13)(12/13) + (12/13)(1/13) = 24/169,

P(X = 0) = P(Non-ace and Non-ace) = (12/13)(12/13) = 144/169.

Hence, the probability distribution of X is given by:

X:       0        1       2
P(X=x):  144/169  24/169  1/169

(b) If cards are drawn without replacement, then the exhaustive number of cases of
drawing 2 cards out of 52 cards is ⁵²C₂.

P(X = 0) = P(no ace) = P(both cards are non-aces) = ⁴⁸C₂/⁵²C₂ = (48 × 47)/(52 × 51) = 188/221,

P(X = 1) = P(one ace and one non-ace) = (⁴C₁ × ⁴⁸C₁)/⁵²C₂ = (4 × 48 × 2)/(52 × 51) = 32/221,

P(X = 2) = P(both aces) = ⁴C₂/⁵²C₂ = (4 × 3)/(52 × 51) = 1/221.

Hence, the probability distribution of X is given by:

X:       0        1       2
P(X=x):  188/221  32/221  1/221
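The two distributions in Example 4 are in fact the binomial and hypergeometric p.m.f.s. The following Python sketch (added for illustration) reproduces both tables with exact arithmetic:

```python
from fractions import Fraction
from math import comb

# Number of aces in two draws, with and without replacement.
with_repl = [
    Fraction(comb(2, x)) * Fraction(4, 52) ** x * Fraction(48, 52) ** (2 - x)
    for x in range(3)
]                                   # binomial: [144/169, 24/169, 1/169]

without_repl = [
    Fraction(comb(4, x) * comb(48, 2 - x), comb(52, 2)) for x in range(3)
]                                   # hypergeometric: [188/221, 32/221, 1/221]

print(with_repl, without_repl)
```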

Example 5: If X is a continuous random variable with p.d.f.

f(x) = kx for 0 ≤ x < 1; k for 1 ≤ x < 2; k(3 − x) for 2 ≤ x ≤ 3; 0 elsewhere,

(i) determine k,
(ii) compute P(X ≤ 1.5).

Solution:

(i) Since f is the p.d.f., we have

∫₋∞^∞ f(x) dx = 1

⟹ ∫₀¹ kx dx + ∫₁² k dx + ∫₂³ k(3 − x) dx = 1

⟹ k[x²/2]₀¹ + k[x]₁² + k[3x − x²/2]₂³ = 1

⟹ k(1/2) + k + k[(9 − 9/2) − (6 − 2)] = 1

⟹ k/2 + k + k/2 = 2k = 1 ⟹ k = 1/2.

(ii)

P(X ≤ 1.5) = ∫₀¹ kx dx + ∫₁^1.5 k dx = k[x²/2]₀¹ + k[x]₁^1.5 = k(1/2 + 1/2) = k = 1/2.


2.1. Random Variables

Exercise
1. State, with reasons, if the following probability distributions are admissible or
not.
(i)
x:     0    1    2
p(x):  0.3  0.2  0.5
(ii)
x:     –1   0    2
p(x):  0.4  0.4  0.3
(iii)
x:     0    1    2    3
p(x):  0.2  0.3  0.3  0.1
(iv)
x:     –2   –1   0     1    2
p(x):  0.3  0.4  –0.2  0.2  0.3

2. Two dice are thrown simultaneously and getting a number less than 3 on a die
is termed as a success. Obtain the probability distribution of the number of
success.

3. Obtain the probability distribution of the number of sixes in two tosses of a


die.
4. Obtain the probability distribution of number of heads of two tosses of a coin.

5. Three cards are drawn at random successively, with replacement, from a well
shuffled pack of cards, getting ‘a card of diamonds’ is termed as a success.
Obtain probability distribution of the number of successes.

6. Two cards are drawn without replacement, form a well shuffled pack of cards.
Obtain the probability distribution of the number of face cards (Jack, Queen,
King and Ace).


7. Five defective mangoes are accidentally mixed with twenty good ones and by
looking at them it is not possible to distinguish between them. Four mangoes
are drawn at random from the lot. Find the probability distribution of , the
number of defective mangoes.

8. Two bad eggs are mixed accidentally with good ones and three are drawn
at random from the lot. Obtain the probability distribution of the number of
bad eggs drawn.

9. An urn contains red and white balls. Three balls are drawn at random.
Obtain the probability distribution of the number of white balls drawn.

10.Suppose that the life in hours of a certain part of radio tube is a continuous
random variable with p.d. f given by

(i) What is the probability that all of three such tubes in a given radio set will
have to be replaced during the first hours of operation?
(ii) What is the probability that none of three of the original tubes will have to
be replaced during that first hours of operation?
(iii) What is the probability that a tube will last less than hours if it is
known that the tube still functioning after hours of service.
(iv) What is the maximum number of tubes that may be inserted into a set so
that there is a probability of that after hours of services all of them
are still functioning?


Answers:

1.
(i) Yes
(ii) No, since Σ p(x) = 1.1 ≠ 1
(iii) No, since Σ p(x) = 0.9 ≠ 1
(iv) No, since p(0) = –0.2 < 0, which is not possible.

2.
X:       0    1    2
P(X=x):  4/9  4/9  1/9

3.
X:       0      1      2
P(X=x):  25/36  10/36  1/36

4.
X:       0    1    2
P(X=x):  1/4  1/2  1/4

5.
X:       0      1      2     3
P(X=x):  27/64  27/64  9/64  1/64

6.
X:       0        1       2
P(X=x):  105/221  96/221  20/221

7.
P(X = x) = ⁵Cₓ · ²⁰C₄₋ₓ / ²⁵C₄, x = 0, 1, 2, 3, 4.

8.

9.

10.

(i)
(ii)
(iii)
(iv) approximately

2.2
Bivariate random variable
In the real life situations more than one variable effects the outcome of a random
experiment. For example, consider an electronic system consisting of two
components. Suppose the system will fail if both the components fail. The
probability distribution of the life of the system depends jointly on the probability
distributions of lives of the components. Knowing the probability distributions of
lives of the components will not provide us the enough information. What we
need is the probability distribution of the simultaneous behavior of lives of the
components. A pair of random variables is known as a bivariate random variable.
The individual random variables in the pair may be related.

Bivariate random variable: Let S be the sample space associated with a
random experiment. Let R be the real line. If X : S → R and Y : S → R are
random variables, then the pair (X, Y) is known as a
bivariate random variable.

Note:

1. If and are both discrete random variables, then is a bivariate


discrete random variable.
2. If and are both continuous random variables, then is a bivariate
continuous random variable.

Joint probability mass function: Let (X, Y) be a bivariate discrete random
variable, which takes the values (xᵢ, yⱼ) for i = 1, 2, …, m and j = 1, 2, …, n.
Let

p(xᵢ, yⱼ) = P(X = xᵢ, Y = yⱼ) for all i and j.

Then p(xᵢ, yⱼ) ≥ 0 for all i and j, and Σᵢ₌₁ᵐ Σⱼ₌₁ⁿ p(xᵢ, yⱼ) = 1. The function p is
known as the joint probability mass function (j.p.m.f.) of (X, Y).


Marginal probability mass functions: Let (X, Y) be a bivariate discrete random
variable with joint probability mass function p(xᵢ, yⱼ). The marginal
probability mass functions of X and Y are given by

p₁(xᵢ) = Σⱼ₌₁ⁿ p(xᵢ, yⱼ) for i = 1, 2, …, m, and

p₂(yⱼ) = Σᵢ₌₁ᵐ p(xᵢ, yⱼ) for j = 1, 2, …, n,

respectively.

Note: X and Y are independent if and only if p(xᵢ, yⱼ) = p₁(xᵢ) p₂(yⱼ) for all i, j.

Conditional probability mass functions: Let (X, Y) be a bivariate discrete random
variable with joint probability mass function p(xᵢ, yⱼ). The conditional
probability mass function of X given Y = yⱼ and the conditional probability mass
function of Y given X = xᵢ are given by

p(xᵢ | yⱼ) = p(xᵢ, yⱼ)/p₂(yⱼ) for i = 1, 2, …, m, and

p(yⱼ | xᵢ) = p(xᵢ, yⱼ)/p₁(xᵢ) for j = 1, 2, …, n,

respectively.

Example 1: A fair coin is tossed three times. Let X be a random variable that
takes the value 0 if the first toss is a tail and the value 1 if the first toss is a head,
and let Y be a random variable that gives the total number of heads in the three
tosses. Then

i. Determine the joint, marginal and conditional mass functions of
X and Y.
ii. Are X and Y independent?

Solution:

i. The sample space and values of X and Y are given in the following table:

Outcome:  HHH  HHT  HTH  HTT  THH  THT  TTH  TTT
X:        1    1    1    1    0    0    0    0
Y:        3    2    2    1    2    1    1    0

Here X takes the values 0 and 1, and Y takes the values 0, 1, 2 and 3. The
j.p.m.f. of (X, Y) is computed as below:

p(x, y)   y = 0   y = 1   y = 2   y = 3
x = 0     1/8     2/8     1/8     0
x = 1     0       1/8     2/8     1/8

The m.p.m.f. of X is given by

p₁(0) = 1/8 + 2/8 + 1/8 + 0 = 1/2 and p₁(1) = 0 + 1/8 + 2/8 + 1/8 = 1/2.

The m.p.m.f. of Y is given by

p₂(0) = 1/8, p₂(1) = 3/8, p₂(2) = 3/8, p₂(3) = 1/8.

The conditional p.m.f. of X given Y = y is computed below:

p(x | y)   y = 0   y = 1   y = 2   y = 3
x = 0      1       2/3     1/3     0
x = 1      0       1/3     2/3     1

The conditional p.m.f. of Y given X = x is computed as below:

p(y | x)   y = 0   y = 1   y = 2   y = 3
x = 0      1/4     2/4     1/4     0
x = 1      0       1/4     2/4     1/4

ii. Here p(0, 0) = 1/8, while p₁(0) p₂(0) = (1/2)(1/8) = 1/16.

Since p(0, 0) ≠ p₁(0) p₂(0), X and Y are not independent.

Example 2: The j.p.m.f. of (X, Y) is given by

p(x, y) = k(2x + y) for x = 1, 2; y = 1, 2; 0 otherwise,

where k is a constant.

a. Find the value of k.
b. Find the marginal and conditional p.m.f.s.
c. Are X and Y independent?

Solution:

a. Since p is a j.p.m.f., Σₓ₌₁² Σᵧ₌₁² p(x, y) = 1

⟹ k ΣΣ (2x + y) = k(3 + 4 + 5 + 6) = 18k = 1.

Thus k = 1/18.

b. The m.p.m.f. of X is given by

p₁(x) = Σᵧ₌₁² p(x, y) = (1/18)[(2x + 1) + (2x + 2)] = (4x + 3)/18.

Thus, p₁(x) = (4x + 3)/18 for x = 1, 2.

The m.p.m.f. of Y is given by

p₂(y) = Σₓ₌₁² p(x, y) = (1/18)[(2 + y) + (4 + y)] = (2y + 6)/18 = (y + 3)/9.

Thus, p₂(y) = (y + 3)/9 for y = 1, 2.

The c.p.m.f. of X given Y = y is given by

p(x | y) = p(x, y)/p₂(y) = (2x + y)/(2y + 6).

Thus, p(x | y) = (2x + y)/(2y + 6) for x = 1, 2.

The c.p.m.f. of Y given X = x is given by

p(y | x) = p(x, y)/p₁(x) = (2x + y)/(4x + 3).

Thus, p(y | x) = (2x + y)/(4x + 3) for y = 1, 2.

c. Note that p(1, 1) = 3/18 = 1/6, while p₁(1) p₂(1) = (7/18)(4/9) = 14/81 ≠ 1/6.
Thus, X and Y are not independent.
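The marginal p.m.f.s and the independence check of Example 2 can be tabulated directly. A small Python sketch, added here for illustration:

```python
from fractions import Fraction

# Tabulate the joint p.m.f. p(x, y) = (2x + y)/18 and derive the marginals.
k = Fraction(1, 18)
joint = {(x, y): k * (2 * x + y) for x in (1, 2) for y in (1, 2)}

p1 = {x: sum(joint[x, y] for y in (1, 2)) for x in (1, 2)}   # (4x+3)/18
p2 = {y: sum(joint[x, y] for x in (1, 2)) for y in (1, 2)}   # (y+3)/9

assert sum(joint.values()) == 1
independent = all(joint[x, y] == p1[x] * p2[y] for (x, y) in joint)
print(p1, p2, independent)   # ... False
```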

Joint probability density function: Let (X, Y) be a bivariate continuous random
variable. Let

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫ₐᵇ ∫꜀ᵈ f(x, y) dy dx

for all real numbers a, b, c, d such that a ≤ b and c ≤ d. Then

i. f(x, y) ≥ 0, and
ii. ∫₋∞^∞ ∫₋∞^∞ f(x, y) dx dy = 1,

and the function f is known as the joint probability density function of the
bivariate continuous random variable (X, Y).

Marginal probability density function: Let (X, Y) be a bivariate continuous
random variable with j.p.d.f. f(x, y). The marginal probability density functions of
X and Y are given by

f₁(x) = ∫₋∞^∞ f(x, y) dy and f₂(y) = ∫₋∞^∞ f(x, y) dx,

respectively.

Note: X and Y are independent if and only if f(x, y) = f₁(x) f₂(y).

Conditional probability density functions: Let (X, Y) be a bivariate continuous
random variable with j.p.d.f. f(x, y). Let f₁(x) and f₂(y) be the m.p.d.f.s of X and Y
respectively. The conditional probability density function of X given Y = y and the
conditional probability density function of Y given X = x are given by

f(x | y) = f(x, y)/f₂(y) and f(y | x) = f(x, y)/f₁(x),

respectively.

Cumulative distribution function: The cumulative distribution function of a bivariate
random variable (X, Y) is defined by F(x, y) = P(X ≤ x, Y ≤ y), where

F(x, y) = Σ_{t ≤ x} Σ_{s ≤ y} p(t, s), if (X, Y) is a d.r.v. with j.p.m.f. p(x, y),

F(x, y) = ∫₋∞ˣ ∫₋∞ʸ f(t, s) ds dt, if (X, Y) is a c.r.v. with j.p.d.f. f(x, y).

Properties of cumulative distribution function

1. F(−∞, y) = 0 = F(x, −∞).
2. F(∞, ∞) = 1.
3. F is non-decreasing in each of x and y.

4. P(a < X ≤ b, c < Y ≤ d) = F(b, d) − F(a, d) − F(b, c) + F(a, c).

Marginal cumulative distribution function: Let (X, Y) be a bivariate random
variable with c.d.f. F(x, y). The marginal cumulative distribution functions of X
and Y are given by F₁(x) = F(x, ∞) and F₂(y) = F(∞, y) respectively.

Note:

If (X, Y) is a bivariate continuous random variable with c.d.f. F(x, y), then its
j.p.d.f. is given by f(x, y) = ∂²F(x, y)/∂x∂y.

Example 3: The j.p.d.f. of (X, Y) is given by

f(x, y) = e^{−(x+y)} for 0 < x < ∞, 0 < y < ∞; 0 otherwise.

a. Find the marginal p.d.f.s of X and Y.
b. Are X and Y independent?

Solution:

a. The m.p.d.f. of X is given by

f₁(x) = ∫₀^∞ e^{−(x+y)} dy = e^{−x} ∫₀^∞ e^{−y} dy = e^{−x} · 1 = e^{−x},

so f₁(x) = e^{−x} for x > 0; 0 otherwise.

The m.p.d.f. of Y is given by

f₂(y) = ∫₀^∞ e^{−(x+y)} dx = e^{−y} ∫₀^∞ e^{−x} dx = e^{−y} · 1 = e^{−y},

so f₂(y) = e^{−y} for y > 0; 0 otherwise.

b. Since f(x, y) = e^{−(x+y)} = f₁(x) f₂(y), X and Y are independent.

Example 4: The j.p.d.f. of (X, Y) is given by

f(x, y) = x e^{−x(y+1)} for 0 < x < ∞, 0 < y < ∞; 0 otherwise.

a. Determine the marginal and conditional p.d.f.s.
b. Are X and Y independent?

Solution: The m.p.d.f. of X is given by

f₁(x) = ∫₀^∞ x e^{−x(y+1)} dy = x e^{−x} ∫₀^∞ e^{−xy} dy = x e^{−x} [−e^{−xy}/x]₀^∞ = e^{−x},

so f₁(x) = e^{−x} for 0 < x < ∞.

The m.p.d.f. of Y is given by

f₂(y) = ∫₀^∞ x e^{−x(y+1)} dx = [−x e^{−x(y+1)}/(y+1)]₀^∞ + (1/(y+1)) ∫₀^∞ e^{−x(y+1)} dx

(using integration by parts)

= 0 + (1/(y+1)) [−e^{−x(y+1)}/(y+1)]₀^∞ = 1/(y+1)²,

so f₂(y) = 1/(y+1)² for 0 < y < ∞.

The conditional p.d.f. of X given Y = y is given by

f(x | y) = f(x, y)/f₂(y) = x(y+1)² e^{−x(y+1)} for x > 0.

The conditional p.d.f. of Y given X = x is given by

f(y | x) = f(x, y)/f₁(x) = x e^{−xy} for y > 0.

Note that f(x, y) ≠ f₁(x) f₂(y). Hence, X and Y are not
independent.
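As an added check, the two marginals of Example 4 follow from one-line symbolic integrations, assuming SymPy is available:

```python
import sympy as sp

# Marginals of f(x, y) = x*exp(-x*(y+1)) on the positive quadrant.
x, y = sp.symbols('x y', positive=True)
f = x * sp.exp(-x * (y + 1))

f1 = sp.simplify(sp.integrate(f, (y, 0, sp.oo)))   # exp(-x)
f2 = sp.simplify(sp.integrate(f, (x, 0, sp.oo)))   # 1/(y + 1)**2
print(f1, f2)
print(sp.simplify(f - f1 * f2) == 0)               # False: not independent
```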


Example 5: The j.p.d.f. of (X, Y) is given by f(x, y) = kx³y for
0 ≤ x ≤ 2, 0 ≤ y ≤ 1; 0 otherwise.

a. Find k.
b. Find the m.p.d.f.s of X and Y.
c. Are X and Y independent?

Solution:

a. We have

∫₀² ∫₀¹ kx³y dy dx = k ∫₀² x³ dx ∫₀¹ y dy = k [x⁴/4]₀² [y²/2]₀¹ = k(4)(1/2) = 2k.

Now, ∫₀² ∫₀¹ f(x, y) dy dx = 1 ⟹ 2k = 1 ⟹ k = 1/2.

Hence the j.p.d.f. of (X, Y) is given by

f(x, y) = x³y/2 for 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.

b. The m.p.d.f. of X is given by

f₁(x) = ∫₀¹ (x³y/2) dy = (x³/2)[y²/2]₀¹ = x³/4

for 0 ≤ x ≤ 2.

The m.p.d.f. of Y is given by

f₂(y) = ∫₀² (x³y/2) dx = (y/2)[x⁴/4]₀² = 2y

for 0 ≤ y ≤ 1.

c. Note that f₁(x) f₂(y) = (x³/4)(2y) = x³y/2 = f(x, y).

Since f(x, y) = f₁(x) f₂(y), X and Y are independent.


2.2. Bivariate Random Variable:

Exercise
1. The joint probability mass function of is given in the following table:

Find (i) , (ii) , (iii) , iv) .

2. The j.p.m.f. of is given by: ,


and .
Find (i) m.p.m.fs of and and (ii) conditional p.m.f. of given .

3. The j.p.m.f. of is given by


for and
Find the m.p.m.fs of and .

4. The j.p.m.f. of is given by


for and
Find the conditional p.m.f of for given .

5. The j.p.d.f. of is given by

Find i) m.p.d.fs of and . ii) Conditional p.d.f. of given .


6. The j.p.d.f. of is given by

i) Find the m.p.d.fs of and .


ii) Are X and Y independent?

7. The j.p.d.f. of is given by

Find m.p.d.fs of and .

8. The j.p.d.f. of is given by

Find i) m.p.d.fs of and . ii) c.p.d.fs of and .


Answers:

1. (i) (ii) (iii) (iv)

2. i)

ii)

3.

4.

5. (i) for , for .

(ii) for .


6. (a) , .
(b) No

7. for , for .

8. and

and


UNIT-III: Mathematical Expectation


3.1
Mathematical Expectation
The term expectation is used for the process of averaging when a random variable
is involved. It is the number used to locate the centre of the probability
distribution (p.m.f. or p.d.f.) of a random variable. A probability distribution is
described by certain statistical measures which are computed using mathematical
expectation (or expectation).

Let X be a random variable defined on a sample space S. Let g be a function
such that g(X) is a random variable. Then the expected value of g(X) is
defined by

E[g(X)] = Σₓ g(x) p(x), if X is a d.r.v. with p.m.f. p(x),

E[g(X)] = ∫₋∞^∞ g(x) f(x) dx, if X is a c.r.v. with p.d.f. f(x),   ……(1)

provided these values exist.

Mean and moments:

i. Let g(X) = X. Then, by formula (1), the expected value of X is defined by

E(X) = Σₓ x p(x), if X is a d.r.v. with p.m.f. p(x),

E(X) = ∫₋∞^∞ x f(x) dx, if X is a c.r.v. with p.d.f. f(x).

Then E(X) is called the mean of the random variable X and it is denoted by μ.

ii. Let g(X) = (X − A)ʳ, where A is an arbitrary constant and r is a non-negative
integer. Then formula (1) gives

E[(X − A)ʳ] = Σₓ (x − A)ʳ p(x), if X is a d.r.v. with p.m.f. p(x),

E[(X − A)ʳ] = ∫₋∞^∞ (x − A)ʳ f(x) dx, if X is a c.r.v. with p.d.f. f(x).

The quantity E[(X − A)ʳ] is called the r-th moment about A and it is denoted by μ′ᵣ.
If A = 0, then the moments E(Xʳ), r = 1, 2, …, are known as raw moments.

iii. Let g(X) = (X − μ)ʳ. Then formula (1) gives

E[(X − μ)ʳ] = Σₓ (x − μ)ʳ p(x), if X is a d.r.v. with p.m.f. p(x),

E[(X − μ)ʳ] = ∫₋∞^∞ (x − μ)ʳ f(x) dx, if X is a c.r.v. with p.d.f. f(x).

The quantity E[(X − μ)ʳ] is called the r-th central moment of X and it is denoted
by μᵣ.

iv. If r = 2, then μ₂ = E[(X − μ)²] and it is known as the variance of the
random variable X, denoted by σ² or V(X).

v. Mean E(X) and variance V(X) are important statistical measures of a
probability distribution.

Example 1: Let be a d.r.v with the p.m.f. given below:

Find and .

Solution:


Example 2: Find the expectation of the number on a die when thrown.

Solution: Let X be the random variable representing the number on a die when
thrown. Then X can take any one of the values 1, 2, …, 6, each with equal
probability 1/6. Hence

E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 7/2.
Example 3: Two unbiased dice are thrown. Find the expected value of the sum
of the numbers of points on them.

Solution: Let X be the sum of the numbers obtained on the two dice. Then X takes the
values 2, 3, …, 12 with probabilities 1/36, 2/36, 3/36, 4/36, 5/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36, and

E(X) = (2·1 + 3·2 + 4·3 + 5·4 + 6·5 + 7·6 + 8·5 + 9·4 + 10·3 + 11·2 + 12·1)/36 = 252/36 = 7.
Example 4: In four tosses of a coin, let X be the number of heads. Find the mean
and variance of X.

Solution: The sample space consists of 2⁴ = 16 outcomes, and X takes the
values 0, 1, 2, 3, 4. Counting the outcomes with each number of heads,
the p.m.f. of X is given in the following table:

X:       0     1     2     3     4
P(X=x):  1/16  4/16  6/16  4/16  1/16

E(X) = (0·1 + 1·4 + 2·6 + 3·4 + 4·1)/16 = 32/16 = 2,

E(X²) = (0·1 + 1·4 + 4·6 + 9·4 + 16·1)/16 = 80/16 = 5,

V(X) = E(X²) − [E(X)]² = 5 − 4 = 1.

Example 5: Find the mean and variance of the random variable X whose p.d.f. is
given by

f(x) = 1/2 for 0 < x < 2; 0 otherwise.

Solution:

E(X) = ∫₋∞^∞ x f(x) dx = (1/2) ∫₀² x dx = (1/2)[x²/2]₀² = 1.

Mean of the random variable X is 1.

Variance = E[(X − 1)²] = ∫₋∞^∞ (x − 1)² f(x) dx = (1/2) ∫₀² (x − 1)² dx
= (1/2)[(x − 1)³/3]₀² = (1/2)(1/3 + 1/3) = 1/3.

Variance of the random variable X is 1/3.

Example 6: Find the mean of the random variable X whose p.d.f. is given by

f(x) = e^{−x} for x > 0; 0 otherwise.

Solution:

E(X) = ∫₀^∞ x e^{−x} dx = [−x e^{−x}]₀^∞ + ∫₀^∞ e^{−x} dx = 0 + 1 = 1

(using integration by parts).

Theorems on Mathematical Expectation:

The following theorems are proved by assuming that the random variables are
continuous. If the random variables are discrete, the proof remains the same
except for replacing integration by summation.

Theorem 1: If X is a random variable and a and b are constants, then
E(aX + b) = aE(X) + b.

Proof: Let X be a c.r.v. with p.d.f. f(x). Then

E(aX + b) = ∫₋∞^∞ (ax + b) f(x) dx = a ∫₋∞^∞ x f(x) dx + b ∫₋∞^∞ f(x) dx
= a E(X) + b   (since ∫₋∞^∞ f(x) dx = 1).

Corollary 1: If b = 0, then E(aX) = aE(X).

Corollary 2: If a = 0, then E(b) = b.

Theorem 2: Addition Theorem of Mathematical Expectation.

If X and Y are random variables, then E(X + Y) = E(X) + E(Y), provided all the
expectations exist.

Proof: Let X and Y be continuous random variables with j.p.d.f. f(x, y) and
m.p.d.f.s f₁(x) and f₂(y) respectively. Then by definition,

E(X) = ∫₋∞^∞ x f₁(x) dx and E(Y) = ∫₋∞^∞ y f₂(y) dy.

Now, E(X + Y) = ∫₋∞^∞ ∫₋∞^∞ (x + y) f(x, y) dx dy

= ∫∫ x f(x, y) dx dy + ∫∫ y f(x, y) dx dy

= ∫ x [∫ f(x, y) dy] dx + ∫ y [∫ f(x, y) dx] dy

= ∫₋∞^∞ x f₁(x) dx + ∫₋∞^∞ y f₂(y) dy = E(X) + E(Y).

Generalization: If X₁, X₂, …, Xₙ are random variables, then

E(X₁ + X₂ + … + Xₙ) = E(X₁) + E(X₂) + … + E(Xₙ),

provided all the expectations exist.

Theorem 3: Multiplication Theorem of Mathematical Expectation

If X and Y are independent random variables, then E(XY) = E(X) E(Y).

Proof: Let X and Y be continuous random variables with j.p.d.f. f(x, y) and
m.p.d.f.s f₁(x) and f₂(y) respectively. Then by definition,

E(X) = ∫₋∞^∞ x f₁(x) dx and E(Y) = ∫₋∞^∞ y f₂(y) dy.

Now, E(XY) = ∫₋∞^∞ ∫₋∞^∞ xy f(x, y) dx dy

= ∫∫ xy f₁(x) f₂(y) dx dy   (X and Y are independent)

= [∫₋∞^∞ x f₁(x) dx][∫₋∞^∞ y f₂(y) dy] = E(X) E(Y).

Generalization: If X₁, X₂, …, Xₙ are independent random variables, then

E(X₁X₂…Xₙ) = E(X₁) E(X₂) … E(Xₙ).


Theorem 4: Mathematical expectation of a linear combination of random
variables.

Let X₁, X₂, …, Xₙ be any random variables and a₁, a₂, …, aₙ be any
constants. Then

E(Σᵢ₌₁ⁿ aᵢXᵢ) = Σᵢ₌₁ⁿ aᵢE(Xᵢ),

provided all the expectations exist.

The proof follows using Theorem 1 and the generalization of Theorem 2.

Theorem 5: V(X) = E(X²) − [E(X)]².

Proof: V(X) = E[(X − μ)²] = E[X² − 2μX + μ²]

= E(X²) − 2μE(X) + μ²   (μ is a constant and E(μ²) = μ²)

= E(X²) − 2μ² + μ² = E(X²) − μ² = E(X²) − [E(X)]².

Note: The formula V(X) = E(X²) − [E(X)]² is simpler to use than E[(X − μ)²].

Theorem 6: If X is a random variable, and a and b are constants, then
V(aX + b) = a² V(X).

Proof: Let Y = aX + b. Then E(Y) = aE(X) + b and

Y − E(Y) = a[X − E(X)]

⟹ V(Y) = E[(Y − E(Y))²] = a² E[(X − E(X))²] = a² V(X).

Corollary 1: If a = 0, then V(b) = 0, i.e., the variance of a constant is zero.

Corollary 2: If b = 0, then V(aX) = a² V(X).

Covariance: If X and Y are two random variables, then the covariance between
them is defined by

Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X) E(Y).

Note:

1. If X and Y are independent, then Cov(X, Y) = 0.
2. Cov(aX, bY) = ab Cov(X, Y), where a and b are constants.
3. Cov(X, X) = V(X).

Theorem 7: Variance of a linear combination of random variables.

Let X₁, X₂, …, Xₙ be any random variables and a₁, a₂, …, aₙ be any
constants. Then

V(Σᵢ₌₁ⁿ aᵢXᵢ) = Σᵢ₌₁ⁿ aᵢ² V(Xᵢ) + 2 ΣΣ_{i<j} aᵢaⱼ Cov(Xᵢ, Xⱼ).

Proof:

Let U = Σᵢ₌₁ⁿ aᵢXᵢ. Then E(U) = Σᵢ₌₁ⁿ aᵢE(Xᵢ) and U − E(U) = Σᵢ₌₁ⁿ aᵢ[Xᵢ − E(Xᵢ)]

⟹ [U − E(U)]² = Σᵢ₌₁ⁿ aᵢ²[Xᵢ − E(Xᵢ)]² + 2 ΣΣ_{i<j} aᵢaⱼ[Xᵢ − E(Xᵢ)][Xⱼ − E(Xⱼ)].

Taking expectations on both sides,

V(U) = Σᵢ₌₁ⁿ aᵢ² E[Xᵢ − E(Xᵢ)]² + 2 ΣΣ_{i<j} aᵢaⱼ E{[Xᵢ − E(Xᵢ)][Xⱼ − E(Xⱼ)]}

⟹ V(Σᵢ₌₁ⁿ aᵢXᵢ) = Σᵢ₌₁ⁿ aᵢ² V(Xᵢ) + 2 ΣΣ_{i<j} aᵢaⱼ Cov(Xᵢ, Xⱼ).

Note:

1. If X₁, X₂, …, Xₙ are independent, then V(Σᵢ₌₁ⁿ aᵢXᵢ) = Σᵢ₌₁ⁿ aᵢ² V(Xᵢ).
2. If n = 2 and a₁ = a₂ = 1, then V(X₁ + X₂) = V(X₁) + V(X₂) + 2 Cov(X₁, X₂).

3. If n = 2, a₁ = 1 and a₂ = −1, then V(X₁ − X₂) = V(X₁) + V(X₂) − 2 Cov(X₁, X₂).

4. If X₁ and X₂ are independent, then V(X₁ ± X₂) = V(X₁) + V(X₂).
Example 7: The j.p.d.f. of X and Y is given by

f(x, y) = 2 − x − y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.

Find

i. the m.p.d.f.s of X and Y,
ii. the c.p.d.f.s of X and Y,
iii. E(X), V(X), E(Y) and V(Y),
iv. the covariance between X and Y.

Solution:

i. f₁(x) = ∫₀¹ (2 − x − y) dy = [2y − xy − y²/2]₀¹ = 2 − x − 1/2 = 3/2 − x,

so f₁(x) = 3/2 − x for 0 ≤ x ≤ 1; 0 otherwise.

Similarly, f₂(y) = 3/2 − y for 0 ≤ y ≤ 1; 0 otherwise.

ii. f(x | y) = f(x, y)/f₂(y) = (2 − x − y)/(3/2 − y), 0 ≤ x ≤ 1,

and f(y | x) = f(x, y)/f₁(x) = (2 − x − y)/(3/2 − x), 0 ≤ y ≤ 1.

iii. E(X) = ∫₀¹ x f₁(x) dx = ∫₀¹ x(3/2 − x) dx = 3/4 − 1/3 = 5/12, and

E(X²) = ∫₀¹ x² f₁(x) dx = ∫₀¹ x²(3/2 − x) dx = 1/2 − 1/4 = 1/4.

Thus V(X) = E(X²) − [E(X)]² = 1/4 − 25/144 = 11/144.

Similarly, E(Y) = 5/12 and V(Y) = 11/144.

iv. E(XY) = ∫₀¹ ∫₀¹ xy(2 − x − y) dx dy = 1/6 (verify!)

Cov(X, Y) = E(XY) − E(X)E(Y) = 1/6 − 25/144 = −1/144.


3.1. Mathematical Expectation

Exercise:
1. Find the mean and variance of the following probability distribution:

2. The probability distribution of a random variable is given below:

Find (a) (b) (c)

3. A fair coin is tossed three times. Let X denote the number of tails appearing.
Find the mean and variance of X.
4. The j.p.m.f of is given in the following table:
X
Y

Find (a) (b) (c) (d) (e)

5. A discrete random variable can take all possible integer values from to ,
each with a probability . Find its mean and variance.


6. The j.p.d.f. of (X, Y) is given by:

f(x, y) = (e^{−x/y} e^{−y})/y for 0 < x < ∞, 0 < y < ∞; 0 otherwise.

Find E(X | Y = y).
(Hint: Find the c.p.d.f. of X given Y = y and hence find its mean.)

7. The j.p.d.f. of (X, Y) is given by

f(x, y) = 25e^{−5y} for 0 < x < 0.2, y > 0; 0 otherwise.

a. Find the m.p.d.f.s of X and Y.
b. Find Cov(X, Y).
8. The j.p.m.f. of (X, Y) is given by

p(x, y) = (2x + y)/18 for x = 1, 2 and y = 1, 2; 0 otherwise.

Find the conditional p.m.f. of
a. X given Y = y, b. Y given X = x.


Answers:
1. Mean = … and variance = ….
2. …
3. Mean = 3/2 and variance = 3/4.
4. …
5. Mean = … and variance = ….

6. E(X | Y = y) = y.

7. f₁(x) = 5 for 0 < x < 0.2; f₂(y) = 5e^{−5y} for y > 0; Cov(X, Y) = 0.

8. p(x | y) = (2x + y)/(2y + 6) for x = 1, 2 and y = 1, 2;

p(y | x) = (2x + y)/(4x + 3) for x = 1, 2 and y = 1, 2.


3.2
Correlation Coefficient and Bivariate Normal Distribution
Meaning of correlation:

In a bivariate distribution we may be interested to find out if there is any correlation
or covariance between the two variables under study. If the change in one variable
affects a change in the other variable, the variables are said to be correlated. If the
two variables deviate in the same direction, i.e., if the increase (or decrease) in one
results in a corresponding increase (or decrease) in the other, correlation is said to
be positive. But, if they constantly deviate in the opposite directions, i.e., if increase
(or decrease) in one results in corresponding decrease (or increase) in the other,
correlation is said to be negative. For example, the correlation between (i) the
heights and weights of a group of persons, and (ii) the income and expenditure, is
positive, and the correlation between (i) price and demand of a commodity and (ii)
the volume and pressure of a perfect gas, is negative. Correlation is said to be perfect
if the deviation in one variable is followed by a corresponding and proportional
deviation in the other.

Karl Pearson's Coefficient of Correlation:

As a measure of intensity or degree of linear relationship between two variables, Karl
Pearson, a British biometrician, developed a formula called the correlation coefficient.
The correlation coefficient between two variables X and Y, usually denoted by r(X, Y)
or rₓᵧ, is a numerical measure of linear relationship between them and is defined by

r(X, Y) = Cov(X, Y)/(σₓ σᵧ),

where Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X)E(Y),

σₓ² = V(X) = E(X²) − [E(X)]² and σᵧ² = V(Y) = E(Y²) − [E(Y)]².


Note:

1. r(X, Y) provides a measure of linear relationship between X and Y. For non-
linear relationships, however, it is not suitable.

2. Karl Pearson's correlation coefficient is also called the product–moment correlation
coefficient.

Properties:

1. −1 ≤ r(X, Y) ≤ 1. If r = −1, the correlation is perfect and negative. If r = 1,
the correlation is perfect and positive.

2. The correlation coefficient is independent of change of origin and scale. That is, if
U = (X − a)/h and V = (Y − b)/k with h, k > 0, then r(U, V) = r(X, Y).

Theorem: Two independent variables are uncorrelated.

Proof:

Consider Cov(X, Y) = E[(X − E(X))(Y − E(Y))]

⟹ Cov(X, Y) = E(XY) − E(X)E(Y).   …… (1)

If X and Y are independent, then

E(XY) = E(X)E(Y).   …… (2)

From (1) and (2), if X and Y are independent, then Cov(X, Y) = 0 and hence r(X, Y) = 0.

The converse need not be true. That is, uncorrelated variables need not be
independent.

Example 1: Let X ~ N(0, 1) and Y = X². Then E(X) = 0 and E(X³) = 0.

Solution: Consider Cov(X, Y) = E(XY) − E(X)E(Y) = E(X³) − E(X)E(X²)

= 0 − 0 = 0.

⟹ Cov(X, Y) = 0, but X and Y are related by Y = X².

Thus, uncorrelated variables need not be independent.

Note: The converse is true if the joint distribution of (X, Y) is bivariate normal.

Example 2: The j.p.m.f. of (X, Y) is given below:

p(x, y)   x = −1   x = 1
y = 0     1/8      3/8
y = 1     2/8      2/8

Find the correlation coefficient between X and Y.

Solution: Computation of the marginal p.m.f.s:

p(x, y)   x = −1   x = 1   p₂(y)
y = 0     1/8      3/8     4/8
y = 1     2/8      2/8     4/8
p₁(x)     3/8      5/8     1

We have

E(X) = Σ x p₁(x) = −1 × 3/8 + 1 × 5/8 = −3/8 + 5/8 = 2/8 = 1/4,

E(X²) = Σ x² p₁(x) = (−1)² × 3/8 + 1² × 5/8 = 1, then

σₓ² = E(X²) − [E(X)]² = 1 − (1/4)² = 1 − 1/16 = 15/16.

Similarly, E(Y) = Σ y p₂(y) = 0 × 4/8 + 1 × 4/8 = 4/8 = 1/2,

E(Y²) = Σ y² p₂(y) = 0² × 4/8 + 1² × 4/8 = 1/2, and

σᵧ² = E(Y²) − [E(Y)]² = 1/2 − (1/2)² = 1/2 − 1/4 = 1/4.

Further,

E(XY) = 0 × (−1) × 1/8 + 0 × 1 × 3/8 + 1 × (−1) × 2/8 + 1 × 1 × 2/8 = 0.

Thus, Cov(X, Y) = E(XY) − E(X)E(Y) = 0 − (1/4)(1/2) = −1/8.

∴ r(X, Y) = Cov(X, Y)/(σₓσᵧ) = (−1/8)/(√(15/16) · √(1/4)) = −1/√15 = −0.2582.

Example 3: Two random variables X and Y have the joint probability density
function

f(x, y) = 2 − x − y for 0 < x < 1, 0 < y < 1; 0 otherwise.

Find the correlation coefficient between X and Y.

Solution: By symmetry in x and y we have f₁(x) = f₂(y), E(X) = E(Y) and
σₓ² = σᵧ², and

the m.p.d.f. of X is given by

f₁(x) = ∫₀¹ f(x, y) dy = ∫₀¹ (2 − x − y) dy = 3/2 − x.

Thus, f₁(x) = 3/2 − x for 0 < x < 1; 0 otherwise.

Consider

E(X) = ∫₀¹ x f₁(x) dx = ∫₀¹ x(3/2 − x) dx = ∫₀¹ (3x/2 − x²) dx = 5/12,

E(X²) = ∫₀¹ x² f₁(x) dx = ∫₀¹ x²(3/2 − x) dx = ∫₀¹ (3x²/2 − x³) dx = 1/4.

Further,

E(XY) = ∫₀¹ ∫₀¹ xy (2 − x − y) dx dy

= ∫₀¹ y [x² − x³/3 − x²y/2]₀¹ dy = ∫₀¹ y (1 − 1/3 − y/2) dy

= ∫₀¹ (2y/3 − y²/2) dy = [y²/3 − y³/6]₀¹ = 1/3 − 1/6 = 1/6.

∴ E(XY) = 1/6.

Thus, σₓ² = E(X²) − [E(X)]² = 1/4 − (5/12)² = 1/4 − 25/144 = 11/144 = σᵧ²,

and Cov(X, Y) = E(XY) − E(X)E(Y) = 1/6 − (5/12)² = 24/144 − 25/144 = −1/144.

∴ The correlation coefficient is given by

r(X, Y) = Cov(X, Y)/(σₓσᵧ) = (−1/144)/√((11/144)(11/144)) = −(1/144)/(11/144) = −1/11.

3.2. Correlation Coefficient and Bivariate Normal Distribution

Exercise:
1. Find the correlation coefficient between X and Y for each of the j.p.d.f.s
f(x, y) of (X, Y) given below:
(i) f(x, y) = … for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.
(ii) f(x, y) = x + y for 0 ≤ x, y ≤ 1; 0 otherwise.
(iii) f(x, y) = 2… for 0 < x < 1, 0 < y < 1; 0 otherwise.

2. If X, Y and Z are uncorrelated r.v.s with 0 mean and standard deviations 5, 12
and 9 respectively, and U = X + Y and V = Y + Z, then find the correlation
coefficient between U and V.

3. If X, Y, Z are uncorrelated r.v.s having the same variance, find the correlation
coefficient between X + Y and Y + Z.

4. If the independent r.v.s X and Y have variances 36 and 16 respectively, find the
correlation coefficient between X + Y and X − Y.


ANSWERS

1. (i) −0.2055 (ii) −1/11 (iii) 0.8

2. 48/65

3. 1/2

4. 5/13


3.3
Moment Generating Function
Certain derivations presented in earlier modules have been somewhat
heavy on algebra. For example, determining the mean and variance of the
binomial distribution turned out to be fairly tiresome. Another example of hard
work was determining the set of probabilities associated with a sum of
random variables. Many of these tasks are greatly simplified by using
generating functions.

Moment Generating Function: The moment generating function (m.g.f.) of a
random variable X is denoted by Mₓ(t) and it is defined as

Mₓ(t) = E(e^{tX}),

provided the expectation exists. Expanding e^{tX} as a power series and taking
expectations term by term,

Mₓ(t) = Σᵣ₌₀^∞ (tʳ/r!) μ′ᵣ,

which gives the m.g.f. in terms of moments.

Therefore the coefficient of tʳ/r! in Mₓ(t) is μ′ᵣ, where μ′₀ = 1 and
μ′ᵣ = E(Xʳ) is the r-th moment about the origin.

The m.g.f. of X about the mean μ is defined as

M_{X−μ}(t) = E(e^{t(X−μ)}) = Σᵣ₌₀^∞ (tʳ/r!) μᵣ,

where μᵣ = E[(X − μ)ʳ] is known as the r-th central moment, for r = 0, 1, 2, ….

Note that μ₀ = 1, μ₁ = 0 and μ₂ = σ².

Since Mₓ(t) generates moments, it is called the moment generating function.

If X is a discrete random variable with p.m.f. p(x), then

Mₓ(t) = E(e^{tX}) = Σₓ e^{tx} p(x).

If X is a continuous random variable with p.d.f. f(x), then

Mₓ(t) = E(e^{tX}) = ∫₋∞^∞ e^{tx} f(x) dx.   …… (1)

Moments Using the Moment Generating Function:

Differentiating equation (1) with respect to t and then putting t = 0 gives

μ′₁ = Mₓ′(0).

In general,

μ′ᵣ = [dʳ Mₓ(t)/dtʳ]ₜ₌₀.

Note: The moment generating function is used to calculate the higher
moments.

Theorems on Moment Generating Function:

Theorem 1: M_{cX}(t) = Mₓ(ct), where c is a constant.

Proof: By definition, Mₓ(t) = E(e^{tX}).

Therefore, M_{cX}(t) = E(e^{t(cX)}) = E(e^{(ct)X}) = Mₓ(ct).

Theorem 2: The moment generating function of the sum of independent
random variables is equal to the product of their respective moment generating
functions, i.e., M_{X₁+X₂+…+Xₙ}(t) = M_{X₁}(t) M_{X₂}(t) … M_{Xₙ}(t).

Proof: By definition,

M_{X₁+…+Xₙ}(t) = E(e^{t(X₁+…+Xₙ)}) = E(e^{tX₁} e^{tX₂} … e^{tXₙ})

= E(e^{tX₁}) E(e^{tX₂}) … E(e^{tXₙ})   (since X₁, X₂, …, Xₙ are independent).

Therefore, M_{X₁+X₂+…+Xₙ}(t) = M_{X₁}(t) M_{X₂}(t) … M_{Xₙ}(t).
Hence the proof.

Uniqueness Theorem of Moment Generating Function:

The m.g.f. of a distribution, if it exists, uniquely determines the distribution. This
implies that corresponding to a given probability distribution there is only one
m.g.f. (provided it exists), and corresponding to a given m.g.f. there is only one
probability distribution. Hence if Mₓ(t) = M_Y(t), then X and Y are identically
distributed.

Effect of Change of Origin and Scale on the Moment Generating Function:

Let a random variable X be transformed to a new variable U by changing both the
origin and scale in X as U = (X − a)/h, where a and h are constants.

The m.g.f. of U (about the origin) is given by

M_U(t) = E(e^{tU}) = E(e^{t(X−a)/h}) = e^{−at/h} E(e^{(t/h)X}) = e^{−at/h} Mₓ(t/h).

Note: If U = (X − μ)/σ, then M_U(t) = e^{−μt/σ} Mₓ(t/σ).


Example 1: If X represents the outcome when a fair die is tossed, find the m.g.f.
of X and hence find E(X) and V(X).

Solution: When a fair die is tossed, P(X = x) = 1/6 for x = 1, 2, …, 6.

∴ Mₓ(t) = Σₓ₌₁⁶ e^{tx} P(X = x) = (1/6) Σₓ₌₁⁶ e^{tx} = (1/6)(e^t + e^{2t} + … + e^{6t}).

Mean = E(X) = Mₓ′(0) = (1/6)(1 + 2 + … + 6) = 21/6 = 7/2.

Now, E(X²) = Mₓ″(0) = (1/6)(1 + 4 + 9 + 16 + 25 + 36) = 91/6,

so V(X) = E(X²) − [E(X)]² = 91/6 − 49/4 = 35/12.

Example 2: Find the m.g.f. of the random variable X whose probability function is
P(X = x) = 1/2ˣ, x = 1, 2, …, and hence find its mean.

Solution: By definition,

Mₓ(t) = E(e^{tX}) = Σₓ₌₁^∞ e^{tx} (1/2)ˣ = Σₓ₌₁^∞ (e^t/2)ˣ

= (e^t/2)/(1 − e^t/2) = e^t/(2 − e^t), for e^t < 2.

Therefore,

Mₓ′(t) = [(2 − e^t)e^t + e^t · e^t]/(2 − e^t)² = 2e^t/(2 − e^t)².

Thus, mean = Mₓ′(0) = 2/(2 − 1)² = 2.

Example 3: If the moments of a random variable X are defined by
E(Xʳ) = 0.6, r = 1, 2, 3, …, show that P(X = 0) = 0.4, P(X = 1) = 0.6 and
P(X ≥ 2) = 0.

Solution: We know that

Mₓ(t) = Σᵣ₌₀^∞ (tʳ/r!) μ′ᵣ,

where μ′ᵣ = E(Xʳ) = 0.6 for r ≥ 1 and μ′₀ = 1.

⟹ Mₓ(t) = 1 + Σᵣ₌₁^∞ (tʳ/r!)(0.6) = 1 + 0.6(e^t − 1) = 0.4 + 0.6e^t.   …… (1)

But by definition,

Mₓ(t) = E(e^{tX}) = Σₓ₌₀^∞ e^{tx} P(X = x)

= P(X = 0) + e^t P(X = 1) + e^{2t} P(X = 2) + …   …… (2)

From equations (1) and (2), we have

P(X = 0) + e^t P(X = 1) + e^{2t} P(X = 2) + … = 0.4 + 0.6e^t.

Equating the coefficients of like terms on both sides,

P(X = 0) = 0.4, P(X = 1) = 0.6 and P(X = x) = 0 for x ≥ 2.
Example 4: Find the m.g.f. of a random variable X whose moments are
μ′ᵣ = (r + 1)! 2ʳ.

Solution: By definition, we have

Mₓ(t) = Σᵣ₌₀^∞ (tʳ/r!) μ′ᵣ = Σᵣ₌₀^∞ (tʳ/r!)(r + 1)! 2ʳ = Σᵣ₌₀^∞ (r + 1)(2t)ʳ = (1 − 2t)⁻², for |2t| < 1.

Example 5: If X ~ B(n, p), find the m.g.f. of X and hence find its mean and
variance.

Solution: Since X ~ B(n, p), its p.m.f. is given by

P(X = x) = ⁿCₓ pˣ qⁿ⁻ˣ, x = 0, 1, …, n, where q = 1 − p.

Then the m.g.f. of X is given by

Mₓ(t) = E(e^{tX}) = Σₓ₌₀ⁿ e^{tx} ⁿCₓ pˣ qⁿ⁻ˣ = Σₓ₌₀ⁿ ⁿCₓ (pe^t)ˣ qⁿ⁻ˣ = (q + pe^t)ⁿ.

Mean = Mₓ′(0) = [n(q + pe^t)ⁿ⁻¹ pe^t]ₜ₌₀ = np(q + p)ⁿ⁻¹ = np.

Next,

E(X²) = Mₓ″(0) = [n(n−1)(q + pe^t)ⁿ⁻²(pe^t)² + n(q + pe^t)ⁿ⁻¹ pe^t]ₜ₌₀ = n(n−1)p² + np.

Now, variance = E(X²) − [E(X)]² = n(n−1)p² + np − n²p² = np(1 − p) = npq.

Thus, E(X) = np and V(X) = npq.

Note that V(X) = npq < np = E(X), since q < 1. Thus, V(X) < E(X).

Note: For the binomial distribution, the mean is always greater than the variance.
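Differentiating the m.g.f. at t = 0, as done above, can be automated. The sketch below (added, assuming SymPy) recovers the binomial mean and variance from (q + pe^t)^n:

```python
import sympy as sp

# Moments of B(n, p) from its m.g.f. by differentiation at t = 0.
t, n, p = sp.symbols('t n p')
q = 1 - p
M = (q + p * sp.exp(t))**n          # binomial m.g.f.

mean = sp.diff(M, t).subs(t, 0)     # n*p
ex2  = sp.diff(M, t, 2).subs(t, 0)
var  = sp.expand(ex2 - mean**2)     # n*p - n*p**2, i.e. npq
print(sp.simplify(mean), sp.factor(var))
```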

Example 6: If X ~ P(λ), find its m.g.f. and hence find its mean and variance.

Solution: Since X ~ P(λ), its p.m.f. is given by

p(x) = e^{−λ} λ^x / x!, x = 0, 1, 2, …, and λ > 0.

The m.g.f. of X is given by

M_X(t) = E[e^{tX}] = Σ_{x=0}^{∞} e^{tx} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^{∞} (λe^t)^x / x!
= e^{−λ} e^{λe^t} = e^{λ(e^t − 1)}.

Since M'_X(t) = λe^t e^{λ(e^t − 1)}; Mean = M'_X(0) = λ.

Now, M''_X(t) = (λe^t + λ²e^{2t}) e^{λ(e^t − 1)}.

Then E(X²) = M''_X(0) = λ + λ².

Thus, variance V(X) = E(X²) − [E(X)]² = λ + λ² − λ² = λ.


Therefore, E(X) = V(X) = λ.

Note: The mean and variance are the same for the Poisson distribution.
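A quick empirical illustration of this note (an addition to the notes; λ = 4 and the sample size are arbitrary choices): a large Poisson sample has sample mean and sample variance close to each other.

import numpy as np

rng = np.random.default_rng(1)
x = rng.poisson(lam=4.0, size=200_000)      # Poisson sample with lambda = 4
print(x.mean(), x.var())                    # both are close to 4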

Example 7: If X ~ NB(r, p), find its m.g.f. and hence find its mean and variance.

Solution: Since X ~ NB(r, p), its p.m.f. is given by

p(x) = C(−r, x) p^r (−q)^x, x = 0, 1, 2, ….

The m.g.f. of X is given by

M_X(t) = Σ_{x=0}^{∞} e^{tx} C(−r, x) p^r (−q)^x = p^r Σ_{x=0}^{∞} C(−r, x)(−qe^t)^x = p^r (1 − qe^t)^{−r}.

Now, M'_X(t) = rq e^t p^r (1 − qe^t)^{−r−1}.

Mean: E(X) = M'_X(0) = rq p^r p^{−r−1} = rq/p.

Further, E(X²) = M''_X(0) = rq/p + r(r + 1)q²/p².

Then V(X) = E(X²) − [E(X)]² = rq/p + r(r + 1)q²/p² − r²q²/p² = rq/p + rq²/p² = rq(p + q)/p².

Hence, variance V(X) = rq/p².


Example 8: Let X be a random variable with p.d.f.

f(x) = (1/3) e^{−x/3}, x > 0.

Find

(i) P(X ≥ 3)
(ii) M.g.f. of X
(iii) E(X) and V(X)

Solution:

(i) P(X ≥ 3) = ∫_3^∞ f(x) dx = ∫_3^∞ (1/3) e^{−x/3} dx = [−e^{−x/3}]_3^∞ = e^{−1}.

(ii) M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} f(x) dx

= (1/3) ∫_0^∞ e^{tx} e^{−x/3} dx = (1/3) ∫_0^∞ e^{−(1/3 − t)x} dx

= (1/3) · 1/(1/3 − t) = 1/(1 − 3t),  for t < 1/3.

(iii) M'_X(t) = 3(1 − 3t)^{−2} and M''_X(t) = 18(1 − 3t)^{−3}, so that

E(X) = M'_X(0) = 3, E(X²) = M''_X(0) = 18, and V(X) = 18 − 3² = 9.


Example 9: Let X be a discrete random variable with p.m.f.

p(x) = 1/(x(x + 1)), x = 1, 2, 3, ….

Show that E(X) does not exist, even though the m.g.f. exists for t ≤ 0.

Solution:

E(X) = Σ_{x=1}^{∞} x p(x) = Σ_{x=1}^{∞} 1/(x + 1) = 1/2 + 1/3 + 1/4 + ….

But Σ_{x=1}^{∞} 1/x is a divergent series.

Therefore, E(X) does not exist and hence no moment exists.

Now, the m.g.f. of X is given by

M_X(t) = Σ_{x=1}^{∞} e^{tx} p(x) = Σ_{x=1}^{∞} e^{tx} / (x(x + 1)).

Substituting z = e^t,

M_X(t) = Σ_{x=1}^{∞} z^x / (x(x + 1)) = z/(1·2) + z²/(2·3) + z³/(3·4) + …,


which converges for |z| ≤ 1, i.e., for t ≤ 0; for t > 0 we have z = e^t > 1, the series
diverges, and M_X(t) does not exist.


3.3 Moment Generating Function

Exercise:
1. Find the m.g.f. of a r.v. whose moments are given by

2. If , find the standard deviation of

3. Find the m.g.f. of a r.v. whose p.d.f. is given by

4. Find the m.g.f. and hence find the mean and variance of a r.v. whose p.d.f. is
given by
i.

ii.

iii.

iv.

v.


Answers:

1.

2.

3.

4.

i.

ii.

iii.

iv.

v.



Unit – 3
Probability Inequalities and Generating Functions

3.4 Probability Inequalities
Inequalities are useful for bounding quantities that might otherwise be hard to
compute. They will also be used in the theory of convergence and limit
theorems.

Chebychev’s Inequality

When we want to find the probability of an event described by a random variable,


its c.d.f or p.d.f. or p.m.f. is required. If it is not known but its mean and variance
are known, we can use Chebychev’s inequality to find the upper bound or lower
bound for the probability of the event.

Theorem 1: If X is a random variable with mean μ and variance σ², then

P(|X − μ| ≥ kσ) ≤ 1/k²,  where k > 0.   … (1)

Proof: The proof is given for a continuous random variable. Let X be a continuous
r.v. with p.d.f. f(x). Then

σ² = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx

= ∫_{−∞}^{μ−kσ} (x − μ)² f(x) dx + ∫_{μ−kσ}^{μ+kσ} (x − μ)² f(x) dx + ∫_{μ+kσ}^{∞} (x − μ)² f(x) dx.

In the first integral, x ≤ μ − kσ, so (x − μ)² ≥ k²σ².

In the third integral, x ≥ μ + kσ, so again (x − μ)² ≥ k²σ².

Thus, dropping the (non-negative) middle integral,

σ² ≥ k²σ² [∫_{−∞}^{μ−kσ} f(x) dx + ∫_{μ+kσ}^{∞} f(x) dx] = k²σ² P(|X − μ| ≥ kσ),

which gives P(|X − μ| ≥ kσ) ≤ 1/k².

Note: The proof is similar in the case of a d.r.v., except that integration is
replaced by summation.

Alternative forms:

Let ε = kσ for ε > 0. Then from (1), we have

P(|X − μ| ≥ ε) ≤ σ²/ε²,   … (2)

and from (2), we have

P(|X − μ| < ε) ≥ 1 − σ²/ε².
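The bound in (2) can be checked empirically. The sketch below is an illustrative addition to the notes; the exponential distribution (mean 1, standard deviation 1) and the values of k are arbitrary choices. For each k, the empirical tail probability stays below the Chebychev bound 1/k².

import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=500_000)    # mean 1, standard deviation 1
mu, sigma = x.mean(), x.std()

for k in (1.5, 2.0, 3.0):
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    print(k, empirical, 1 / k**2)               # empirical tail never exceeds 1/k^2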


Example 1: If a r.v. has mean and variance and the probability


distribution is unknown, then find .

Solution: Since the probability distribution of X is not known, we cannot find the exact
value of the required probability. We can find only a lower bound for the probability
using Chebychev's inequality. We have P(|X − μ| < kσ) ≥ 1 − 1/k², for k > 0.

Given and .

Then

Let . Then

Thus, the probability of X lying between and is at least .

Example 2: A d.r.v. takes the values and with probabilities and


respectively. Evaluate and compare it with the upper bound
given by Chebychev’s inequality.

Solution: We have,

Then

and


Hence

Consider

On the other hand, by Chebychev’s inequality,

Note that the two values are the same.

Example 3: Use Chebychev’s inequality to find how many times must a fair coin
be tossed in order that the probability that the ratio of the number of heads to
the number of tosses will lie between and will be at least .

Solution: Let denote the number of heads obtained when a fair coin is tossed
times. Then . That is and .

Let . Then

and

Since for a fair coin, and

By Chebychev’s inequality for


Notice that, if then

Now, find when and

Thus,

Bienayme – Chebychev's inequality

Theorem 3: Let g(X) be a non-negative function of a r.v. X. Then for every
k > 0, we have

P(g(X) ≥ k) ≤ E[g(X)] / k.

Proof: Here we shall prove the theorem for a continuous random variable. The
proof can be adapted to the case of a discrete random variable on replacing
integration by summation over the given range of the variable.

Let S be the set of all x for which g(x) ≥ k. That is, S = {x : g(x) ≥ k}.

Now, E[g(X)] ≥ ∫_S g(x) f(x) dx, where f(x) is the p.d.f. of X

≥ k ∫_S f(x) dx   (on S, g(x) ≥ k)

= k P(g(X) ≥ k).

∴ P(g(X) ≥ k) ≤ E[g(X)] / k.


Note:

1. If g(X) = (X − μ)², then E[g(X)] = σ², and replacing k by k²σ² in the above
inequality, we get

P(|X − μ| ≥ kσ) ≤ 1/k²,

which is Chebychev's inequality.

2. If g(X) = |X| in the above inequality, then we get, for any k > 0,

P(|X| ≥ k) ≤ E(|X|) / k,

which is known as Markov's inequality.

3. If g(X) = |X|^r in the above inequality, then we get

P(|X| ≥ k) ≤ E(|X|^r) / k^r,

which is known as the generalized Markov's inequality.

Cauchy – Schwartz Inequality

When the j.p.d.f. of X and Y is known, an upper bound for the expected value of the
product of X and Y can be found by using the Cauchy – Schwartz inequality, when the
second moments about the origin of X and Y are given (i.e., E(X²) and E(Y²) are
given).

Theorem 2: For any two random variables X and Y,

[E(XY)]² ≤ E(X²) E(Y²).

Proof: Consider E[(tX + Y)²] ≥ 0 for any real number t. That is,

t² E(X²) + 2t E(XY) + E(Y²) ≥ 0,

which is a quadratic expression in t. This expression is non-negative for all t only when
the corresponding equation has no two distinct real roots, i.e., only when the
discriminant of the expression is non-positive. Thus,

4[E(XY)]² − 4E(X²)E(Y²) ≤ 0, i.e., [E(XY)]² ≤ E(X²)E(Y²).

Hence the result.
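A quick numerical check of the theorem (an illustrative addition; the joint distribution below, a pair of correlated normal samples, is an arbitrary choice):

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = 0.6 * x + rng.normal(size=100_000)      # Y correlated with X

lhs = np.mean(x * y) ** 2                   # [E(XY)]^2
rhs = np.mean(x**2) * np.mean(y**2)         # E(X^2) E(Y^2)
print(lhs <= rhs, lhs, rhs)                 # True: the inequality holds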

Example 4: The joint p.m.f. of (X, Y) is given by

for and .

Verify the Cauchy-Schwartz inequality.

Solution: The joint and marginal p.m.fs and of and respectively


are given in the following table.


Verification of the Cauchy-Schwartz inequality:

Here and

Note that .

Example 5: Let and be c.r.vs with j.p.d.f.

Verify Cauchy-Schwartz inequality.

Solution: The m.p.d.f. of is given by

Since is symmetric in , the m.p.d.f. of is given by

Now,


Similarly, . Now,

Thus , and

Hence .


3.4 Probability Inequalities

Exercise:

1. The Chebychev’s inequality for random variable is ,


find and .

2. Two unbiased dice are thrown. If is the sum of the numbers showing up,
prove that . Compare this with the actual probability.

3. If is the number scored in a throw of a fair die, find the upper bound for
where . Also find the actual probability.

4. If is a r.v. such that and , find the lower bound of


.

5. A discrete random variable is specified by and .


Compute
(i) and
(ii) Chebychev’s inequality bound.

Answers:
1. and

2. Actual probability

3. Upper bound and actual probability

4.

5. (i) (ii)


UNIT-IV: Special Probability Distributions


4.1 Discrete Probability Distributions

The earlier modules deal with general properties of random variables. Random
variables with special probability distributions are encountered in different fields
of science and engineering. Some specific discrete probability distributions are
discussed in this module, and some specific continuous probability distributions
are discussed in the next module.

Discrete Uniform Distribution: A r.v. X is said to have a discrete uniform
distribution over the range {1, 2, …, n} if its p.m.f. is given by

p(x) = 1/n, x = 1, 2, …, n.

Notation: X ~ U(n), read as: X follows the discrete uniform distribution with
parameter n.

Note: If all possible values of a r.v. are equally likely, then this distribution is used.

Example 1: If an unbiased coin is tossed once and X is equal to the number of heads,
then X takes the values 0 and 1 with

P(X = 0) = P(X = 1) = 1/2, and E(X) = 1/2, V(X) = 1/4.

Example 2: If an unbiased die is thrown once and X is equal to the number on the die,
then P(X = x) = 1/6 for x = 1, 2, …, 6, and E(X) = 7/2, V(X) = 35/12.

Mean and Variance: We have

E(X) = (1/n) Σ_{i=1}^{n} i = (n + 1)/2

and

E(X²) = (1/n) Σ_{i=1}^{n} i² = (n + 1)(2n + 1)/6.

Thus,

V(X) = E(X²) − [E(X)]² = (n + 1)(2n + 1)/6 − (n + 1)²/4 = (n² − 1)/12.

Bernoulli Experiment: A random experiment whose outcomes are of two types,
success (S) and failure (F), occurring with probabilities p and q = 1 − p
respectively, is called a Bernoulli experiment.

Conducting a Bernoulli experiment once is known as a Bernoulli trial. Note that
p and q are the same in each trial, and the outcomes of different trials are independent.

Bernoulli distribution: In a Bernoulli experiment, if a r.v. X is defined such that it
takes value 1 with probability p when S occurs and 0 with probability q when F
occurs, then we say that X follows the Bernoulli distribution, and its p.m.f. is given by

p(x) = p^x q^{1−x}, x = 0, 1.

Examples:

1) Tossing of a coin (results a head or tail)


2) Performance of a student in an examination (results pass or failure)
3) Sex of an unborn child (results female or male)

Mean and Variance:

Mean: E(X) = 0·q + 1·p = p,

and E(X²) = 0²·q + 1²·p = p.

Variance: V(X) = E(X²) − [E(X)]² = p − p² = pq.

Binomial Distribution: Suppose we conduct n independent Bernoulli trials and we
define

X = number of successes in n trials.

Then X is a discrete random variable and it takes the values 0, 1, 2, …, n.

Derivation of P(X = x): Note that X = x means that there are x successes and
n − x failures in n trials, in a specified order, (say) SS…S FF…F.


Since the outcomes of different trials are independent, by the Multiplication Theorem, we
have

P(SS…S FF…F) = p^x q^{n−x}.

But x successes in n trials can occur in C(n, x) orders, and the probability for each of
these orders is the same, viz., p^x q^{n−x}. Hence, by the addition theorem of probability,

P(X = x) = C(n, x) p^x q^{n−x}, x = 0, 1, …, n.

Definition: A r.v. X is said to follow a binomial distribution with parameters n and
p if its p.m.f. is given by

p(x) = C(n, x) p^x q^{n−x}, x = 0, 1, …, n, where q = 1 − p.

Notation: X ~ B(n, p). Read as X follows the binomial distribution with parameters
n and p.

Real life examples:

1) Number of heads in tosses of a coin


2) Number of boys in a family of children
3) Number of times hitting a target in attempts

Note:

1. Σ_{x=0}^{n} p(x) = Σ_{x=0}^{n} C(n, x) p^x q^{n−x} = (q + p)^n = 1.


2. The c.d.f. of X is given by F(x) = P(X ≤ x) = Σ_{k ≤ x} C(n, k) p^k q^{n−k}.

Example 1: Four fair coins are tossed. If the outcomes are assumed to be
independent, then find the p.m.f. and c.d.f. of the number of heads obtained.

Solution: Let X be the number of heads in tossing 4 coins.

Then X ~ B(4, 1/2), where p = q = 1/2.

Thus

p(x) = C(4, x)(1/2)^4 = C(4, x)/16, for x = 0, 1, 2, 3, 4.

The p.m.f. and c.d.f. are given in the following table.

x    :   0      1      2      3      4
p(x) :  1/16   4/16   6/16   4/16   1/16
F(x) :  1/16   5/16  11/16  15/16  16/16
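The table is easy to reproduce with scipy.stats (an illustrative addition to the notes):

from scipy.stats import binom

for x in range(5):                          # X ~ B(4, 1/2)
    print(x, binom.pmf(x, n=4, p=0.5), binom.cdf(x, n=4, p=0.5))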


Example 2: A and B play a game in which their chances of winning are in the
ratio . Find A's chance of winning at least three games out of the five
games played.

Solution:

Define No. of games winning out of 5.

Here and and . Thus,

for .

Required to find:

Example 3: The probability of a man hitting a target is .

(i) If he fires 7 times, what is the probability of his hitting the target at
least twice?
(ii) How many times must he fire so that the probability of his hitting the
target at least once is greater than ?

Solution: Let be the no. of times a man hitting the target in fires. Here
and . Then and

for .

(i)

(ii) Find such that

; since n cannot be fractional, the


required number of shots is .

Mean of Binomial Distribution: E(X) = np.

Variance of Binomial Distribution: V(X) = npq.

(For a proof via the m.g.f., see Example 5 of the module on moment generating functions.)


Example 4: One hundred balls are tossed into 50 boxes. What is the expected
number of balls in the tenth box?

Solution: If we think of the 100 balls tossed as 100 Bernoulli trials in which a success is
defined as getting a ball in the tenth box, then p = 1/50. If X denotes the number of
balls that go into the tenth box,

then X ~ B(100, 1/50) and E(X) = np = 100 × (1/50) = 2.

Example 5: The mean and variance of binomial distribution are and


respectively. Find .

Solution: Here . But and .

Hence and .

Thus, and hence

Poisson Distribution:

If n → ∞ and p → 0 such that np = λ remains fixed, then

C(n, x) p^x q^{n−x} → e^{−λ} λ^x / x!,

which is the p.m.f. of the Poisson distribution. Thus the p.m.f. of the Poisson
distribution is obtained as the limit of the p.m.f. of the binomial distribution.

Definition: A r.v. X is said to follow a Poisson distribution with parameter λ > 0 if its
p.m.f. is given by

p(x) = e^{−λ} λ^x / x!, x = 0, 1, 2, ….

Notation: X ~ P(λ). Read as: X follows the Poisson distribution with parameter λ.


Real life examples

1) Number of defectives in a packet of blades.


2) Number of telephone calls received at a particular telephone exchange in
some unit of time.
3) Number of print mistakes in a page of a book.
4) The number of fragments received by a surface area ‘ ’ from a fragment atom
bomb.
5) The emission of radio active (alpha) particles.
6) Number of air accidents in some unit of time.

Note:

1. Σ_{x=0}^{∞} p(x) = Σ_{x=0}^{∞} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^{∞} λ^x / x! = e^{−λ} e^{λ} = 1.

2. The c.d.f. of X is given by F(x) = P(X ≤ x) = Σ_{k ≤ x} e^{−λ} λ^k / k!.

Example 6: Messages arrive at a switchboard in a Poisson manner at an average


rate of six per hour. Find the probability for each of the following events:

a) Exactly two messages arrive within one hour.


b) No message arrives within one hour.
c) At least three messages arrive within one hour.

Solution: Let X be the r.v. that denotes the number of messages arriving at the
switchboard within a one-hour interval. Then X ~ P(6) and its p.m.f. is given by

p(x) = e^{−6} 6^x / x!, for x = 0, 1, 2, ….

a) P(X = 2) = e^{−6} 6²/2! = 18 e^{−6} ≈ 0.0446.

b) P(X = 0) = e^{−6} ≈ 0.0025.

c) P(X ≥ 3) = 1 − P(X ≤ 2) = 1 − e^{−6}(1 + 6 + 18) = 1 − 25e^{−6} ≈ 0.938.
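The three probabilities can be checked with scipy.stats (an illustrative addition to the notes):

from scipy.stats import poisson

print(poisson.pmf(2, mu=6))                 # a) exactly two messages
print(poisson.pmf(0, mu=6))                 # b) no message
print(1 - poisson.cdf(2, mu=6))             # c) at least three messages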

Example 7: In a book of 520 pages, 390 typographical errors occur. Assuming the
Poisson law for the number of errors per page, find the probability that a
random sample of 5 pages will contain no error.

Solution:

Average number of typographical errors per page = λ = 390/520 = 0.75.

Let X = number of errors per page.

Then X ~ P(0.75) and p(x) = e^{−0.75}(0.75)^x / x!, x = 0, 1, 2, ….

P(no error) = P(X = 0) = e^{−0.75}.

P(a random sample of 5 pages contains no error) = (e^{−0.75})^5 = e^{−3.75} ≈ 0.0235.

Mean of Poisson Distribution: The mean of the Poisson distribution is given by E(X) = λ.

Variance of Poisson Distribution: The variance of the Poisson distribution is given by V(X) = λ.

Note that for the Poisson distribution, the mean and variance are equal.

Example 8: If and are independent Poisson variates such that


and , find the variance of .

Solution: Let and . Then

for and

for .

Since ;

( is not admissible)

Since , then


Negative Binomial (or Pascal) Distribution:

Let X denote the number of failures before the r-th success in a sequence of
Bernoulli trials. Then the number of trials required is x + r.

Derivation of P(X = x):

In x + r trials, the last trial must be a success, whose probability is p. In the
remaining x + r − 1 trials, we must have r − 1 successes, whose probability is
C(x+r−1, r−1) p^{r−1} q^x (using the binomial distribution).

Thus, by the multiplication theorem, we have

P(X = x) = C(x+r−1, r−1) p^r q^x, x = 0, 1, 2, ….

Definition: A random variable X is said to follow a negative binomial
distribution (NBD) with parameters r and p if its p.m.f. is given by

p(x) = C(x+r−1, r−1) p^r q^x, x = 0, 1, 2, ….

Notation: X ~ NB(r, p).

Note:

1. C(x+r−1, x) = (−1)^x C(−r, x).

Thus, the p.m.f. of the NBD can be written as

p(x) = C(−r, x) p^r (−q)^x, x = 0, 1, 2, ….

Further, p(x) is the (x + 1)-th term in the expansion of p^r (1 − q)^{−r}, a binomial
expansion with negative index. Therefore, the distribution is known as the negative
binomial distribution.

2. Σ_{x=0}^{∞} p(x) = Σ_{x=0}^{∞} C(−r, x) p^r (−q)^x = p^r (1 − q)^{−r} = p^r p^{−r} = 1.

Mean of NBD: The mean of the NBD is given by E(X) = rq/p.

Variance of NBD: V(X) = rq/p². (For a proof via the m.g.f., see Example 7 of the
module on moment generating functions.)

3. Notice that p < 1 implies rq/p < rq/p², and this implies that the mean is smaller
than the variance in the NBD.

4. If Y = number of trials required to get the r-th success, then Y = X + r and

P(Y = y) = C(y−1, r−1) p^r q^{y−r} for y = r, r+1, …, and E(Y) = r/p and V(Y) = rq/p².

Real life examples

1) Number of tails before the third head.


2) Number of girls before the second son.
3) Number of non-defectives before the third defective.


Example 9: Find the probability that there are two daughters before the second
son in a family, when the probability of a son is 0.5.

Solution: Let X = the number of daughters before the second son.

Then X ~ NB(2, 0.5),

and P(X = 2) = C(3, 1)(0.5)²(0.5)² = 3/16.
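The same probability can be checked with scipy.stats.nbinom, which, like X here, counts failures (daughters) before the r-th success (an illustrative addition to the notes):

from scipy.stats import nbinom

print(nbinom.pmf(2, n=2, p=0.5))            # 3/16 = 0.1875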

Example 10: An item is produced in large numbers. The machine is known to


produce defectives. A quality control inspector is examining the items by
taking them at random. What is the probability that at least items are to be
examined in order to get defectives?

Solution: Let No. of items to be examined in order to get defectives. Here


(defective) .

Then

We want to find

Geometric distribution:

Let X denote the number of failures before the first success in a sequence of
Bernoulli trials. Then the required number of trials is x + 1.

The geometric distribution is a particular case of the negative binomial distribution
with r = 1.


Definition: A random variable X is said to follow a geometric distribution (GD)
with parameter p if its p.m.f. is given by

p(x) = q^x p, x = 0, 1, 2, ….

Notation: X ~ G(p).

Mean and variance of GD:

E(X) = q/p and V(X) = q/p² (take r = 1 in the mean and variance of the NBD).

Note: Let Y = the number of trials required to get the first success; then Y = X + 1 and

P(Y = y) = q^{y−1} p, y = 1, 2, ….

Further, E(Y) = 1/p and V(Y) = q/p².

Real life examples

1) Number of tails before the first head

2) Number of girls before the first son
3) Number of non-defectives before the first defective

Example 11: Find the probability that there are two daughters before the first
son in a family, where the probability of a son is 0.5.

Solution: Let X = the number of daughters before the first son.

Then P(X = x) = (0.5)^x (0.5) for x = 0, 1, 2, …,

and P(X = 2) = (0.5)²(0.5) = 1/8.

Hypergeometric Distribution:

Consider an urn with N balls, of which M are white and N − M are red. Suppose
we draw a sample of n balls at random with replacement. Let X denote the
number of white balls in the sample. Then X ~ B(n, p), where p = M/N, which
remains the same for all trials, and the outcomes of different trials are independent. The
p.m.f. of X is given by p(x) = C(n, x) p^x q^{n−x} for x = 0, 1, …, n.

If the sample is selected without replacement, p is not the same for all trials and the
outcomes of different trials are not independent, and hence the binomial distribution
cannot be applied.

Derivation of P(X = x): The number of all possible samples without
replacement = C(N, n).

The number of samples in which there are x white balls and
n − x red balls = C(M, x) C(N−M, n−x).

Thus P(X = x) = C(M, x) C(N−M, n−x) / C(N, n).

Definition: A random variable X is said to follow the hypergeometric distribution
if its p.m.f. is given by

p(x) = C(M, x) C(N−M, n−x) / C(N, n), x = 0, 1, …, min(n, M).

Example 12: A bag contains 4 white balls and 3 green balls. Three balls are
drawn. What is the probability that 2 are white?

Solution: Here N = 7, M = 4 (number of white balls) and n = 3, so

P(X = 2) = C(4, 2) C(3, 1) / C(7, 3) = (6 × 3)/35 = 18/35.

Note: Σ_{x=0}^{min(n,M)} p(x) = Σ_x C(M, x) C(N−M, n−x) / C(N, n) = C(N, n)/C(N, n) = 1,

since Σ_x C(M, x) C(N−M, n−x) = C(N, n), whether the sum runs up to
min(n, M) = n or min(n, M) = M (Vandermonde's identity).
Mean and variance of the hypergeometric distribution:

The mean is given by E(X) = nM/N and the variance is given by

V(X) = nM(N − M)(N − n) / (N²(N − 1)).



4.2 Continuous Probability Distributions

Continuous probability distributions are used in a number of applications in
engineering. For example, in error analysis, given a set of data or a probability distribution,
it is possible to estimate the probability that a measurement (temperature, pressure,
flow rate) will fall within a desired range, and hence determine how reliable an
instrument or piece of equipment is. Also, one can calibrate an instrument
(e.g., a temperature sensor) from the manufacturer on a regular basis and use a probability
distribution to see if the variance in the instrument's measurements increases or
decreases over time.

Uniform Distribution

A continuous random variable (c.r.v.) X is said to have a uniform distribution over the
interval (a, b) if its p.d.f. is given by

f(x) = 1/(b − a), a < x < b (and 0 otherwise).

Notation: X ~ U(a, b). Read as X follows the uniform distribution with parameters a and b.

It is used to model events that are equally likely to occur at any time within a given time
interval. The p.d.f. is constant, at height 1/(b − a), over (a, b).


The cumulative distribution function (c.d.f.) of X is given by

F(x) = 0 for x ≤ a;  F(x) = (x − a)/(b − a) for a < x < b;  F(x) = 1 for x ≥ b.

The mean of X is given by

E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_a^b x/(b − a) dx = (a + b)/2.

Now, E(X²) = ∫_a^b x²/(b − a) dx = (a² + ab + b²)/3.

Thus, the variance of X is given by

V(X) = E(X²) − [E(X)]² = (a² + ab + b²)/3 − (a + b)²/4 = (b − a)²/12.

Example 1: The time that a professor takes to grade a paper is uniformly distributed
between minutes and minutes. Find the mean and variance of the time the
professor takes to grade a paper.

Solution: Let X denote the time the professor takes to grade a paper. Then
.

and

Normal Distribution

The normal distribution was first discovered by De Moivre and Laplace as the limiting
form of the binomial distribution. Through a historical error it was credited to Gauss, who
first made reference to it as the distribution of errors in astronomy. Gauss used the normal
curve to describe the theory of accidental errors of measurements involved in the
calculation of orbits of heavenly bodies.

Definition: A c.r.v. X is said to have a normal distribution with parameters μ and σ² if
its p.d.f. is given by

f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)), −∞ < x < ∞.

The c.d.f. of X is given by

F(x) = ∫_{−∞}^{x} f(t) dt = (1/(σ√(2π))) ∫_{−∞}^{x} exp(−(1/2)((t − μ)/σ)²) dt.

Notation: X ~ N(μ, σ²). Read as X follows the normal distribution with parameters
μ and σ².

Note:

1. The graph of f(x) is the famous bell-shaped curve and is symmetric about the line
x = μ. The top of the bell is directly above x = μ. For large values of σ, the curve tends
to flatten out, and for small values of σ, it has a sharp peak.

2. Whenever the random variable is continuous and the probabilities of it are increasing
and then decreasing, in such cases we can think of using normal distribution.

Real life examples:

1) The heights of students.


2) The weights of students.
3) The diameters of bolts manufactured.
4) The lives of electrical bulbs manufactured.


3. Note that
 f  x dx  1
Standard Normal distribution

If then is known as standard normal distribution with mean


with variance and we write .Its p. d. f. is given by


and its c.d.f. is given by

Φ(z) = ∫_{−∞}^{z} g(t) dt = (1/√(2π)) ∫_{−∞}^{z} exp(−t²/2) dt.

Area Property of Normal Distribution

If X ~ N(μ, σ²), then

P(μ ≤ X ≤ x₁) = (1/(σ√(2π))) ∫_{μ}^{x₁} exp(−(1/2)((x − μ)/σ)²) dx.

Let Z = (X − μ)/σ. Then dx = σ dz.

If x = μ, then z = 0. If x = x₁, then z = (x₁ − μ)/σ = z₁ (say).

∴ P(μ ≤ X ≤ x₁) = P(0 ≤ Z ≤ z₁) = ∫_0^{z₁} g(z) dz = (1/√(2π)) ∫_0^{z₁} exp(−z²/2) dz,

where g(z) is the p.d.f. of the standard normal variate. The definite integral

∫_0^{z₁} g(z) dz is known as the normal probability integral and gives the area under the standard
normal curve between the ordinates at z = 0 and z = z₁. These areas have been
tabulated for different values of z₁, at intervals of 0.01, in the table given at the end of
the module.

In particular, the probability that the random variable X lies in the interval (μ − σ, μ + σ)
is given by

P(μ − σ ≤ X ≤ μ + σ) = P(−1 ≤ Z ≤ 1) = ∫_{−1}^{1} g(z) dz

= 2 ∫_0^{1} g(z) dz   (by symmetry)

= 2 × 0.3413 = 0.6826.   (from the table)


Similarly,

P(μ − 2σ ≤ X ≤ μ + 2σ) = P(−2 ≤ Z ≤ 2) = 2 × 0.4772 = 0.9544,   (see table)

and

P(μ − 3σ ≤ X ≤ μ + 3σ) = P(−3 ≤ Z ≤ 3) = 2 × 0.4987 = 0.9974.   (see table)

Thus, the probability that a normal variate lies outside the range μ ± 3σ is given by

P(|X − μ| > 3σ) = 1 − 0.9974 = 0.0026.

Thus, in all probability, we should expect a normal variate to lie within the range
μ ± 3σ, though theoretically it may range from −∞ to ∞.
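These areas are easy to verify with scipy.stats.norm (an illustrative addition to the notes; the small differences from the tabled values above are rounding):

from scipy.stats import norm

for k in (1, 2, 3):
    print(k, norm.cdf(k) - norm.cdf(-k))    # ≈ 0.6827, 0.9545, 0.9973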


Note: The Gamma function defined below is used to evaluate the mean and variance of the
normal distribution.

Γ(n) = ∫_0^∞ e^{−x} x^{n−1} dx for n > 0,

with Γ(n) = (n − 1)! when n is a positive integer, and Γ(1/2) = √π.

Mean of Normal distribution

E(X) = ∫_{−∞}^{∞} x f(x) dx = (1/(σ√(2π))) ∫_{−∞}^{∞} x e^{−(1/2)((x−μ)/σ)²} dx.

Let z = (x − μ)/σ. Then x = μ + σz, dx = σ dz, and

E(X) = (1/√(2π)) ∫_{−∞}^{∞} (μ + σz) e^{−z²/2} dz

= μ (1/√(2π)) ∫_{−∞}^{∞} e^{−z²/2} dz + σ (1/√(2π)) ∫_{−∞}^{∞} z e^{−z²/2} dz = μ · 1 + σ · 0 = μ.

Note that the integral in the first term is 1, since the total probability is one, and the integral in
the second term is zero, since the integrand is an odd function.

Therefore, Mean = E(X) = μ.

Variance of Normal distribution

V(X) = ∫_{−∞}^{∞} (x − μ)² f(x) dx = (1/(σ√(2π))) ∫_{−∞}^{∞} (x − μ)² e^{−(1/2)((x−μ)/σ)²} dx.

Let z = (x − μ)/σ. Then dx = σ dz, and

V(X) = (σ²/√(2π)) ∫_{−∞}^{∞} z² e^{−z²/2} dz = (2σ²/√(2π)) ∫_0^{∞} z² e^{−z²/2} dz

(since the integrand is an even function).

Let t = z²/2, so z = √(2t) and dz = dt/√(2t). Then

V(X) = (2σ²/√(2π)) ∫_0^{∞} 2t e^{−t} dt/√(2t) = (2σ²/√π) ∫_0^{∞} e^{−t} t^{1/2} dt

= (2σ²/√π) Γ(3/2) = (2σ²/√π)(√π/2) = σ².   (Gamma function)

Note: Standard deviation = σ.

Example 2: If is normally distributed with mean and standard deviation , then

(a) Find the probabilities of the following :


(i)
(ii) and
(iii)
(b) Find when
(c) Find and when and

Solution:
(a) It is given that and .

(i) Let . then

(from table)


(ii)
(iii)

(by symmetry)
(from table)

(b)

, where

From normal tables, corresponding to probability , value of


(approximately)


Hence

(c) We are given that and

and

and , where and

By symmetry of normal curve, . Find such that

Corresponding to probability from the normal table, we have


approximately. Thus

Similarly,


Example 3: The local authorities in a certain city install electric lamps in the
streets of the city. If these lamps have an average life of burning hours with a
standard deviation of hours, assuming normality, what number of lamps might
be expected to fail

(i) in the first and burning hours?


(ii) between and burning hours?

After what period of burning hours would you expect that

(a) of the lamps would fail?


(b) of the lamps would be still burning?

Solution:

Let denote the life of a bulb in burning hours. Here and

(i) Find

, where

Out of bulbs, number of bulbs which fail in the first hours is

(ii) Find


Hence, the expected number of bulbs with life between and hours of burning
life is .

(a) Let of the bulbs fail after hours of burning life. Then we have to find such
that

, where

From table corresponding to probability 0.40, we have

Thus, after hours of burning life, of the bulbs will fail.

(b) Let of the bulbs be still burning after hours of burning life. Then we have


,where

From normal tables, and hence

Hence, after hours of burning life, of the bulbs will be still burning.

De Moivre-Laplace Theorem (Normal Approximation to Binomial Distribution)

Let X ~ B(n, p). Then its p.m.f. is given by p(x) = C(n, x) p^x q^{n−x} for x = 0, 1, …, n. The
mean and variance of X are given by np and npq respectively. Now,

P(k₁ ≤ X ≤ k₂) = Σ_{x=k₁}^{k₂} C(n, x) p^x q^{n−x}

for some non-negative integers k₁ and k₂ such that k₁ ≤ k₂ ≤ n. Since the binomial
coefficient grows quite rapidly with n, it is very difficult
to compute P(k₁ ≤ X ≤ k₂) for large n. In this context, the normal approximation to the
binomial distribution is extremely useful.

Let Z = (X − np)/√(npq). If n is large, with neither p nor q close to zero, the binomial
distribution can be approximated by the standard normal distribution. Thus,

lim_{n→∞} P(k₁ ≤ X ≤ k₂) = lim_{n→∞} P(z₁ ≤ Z ≤ z₂) = (1/√(2π)) ∫_{z₁}^{z₂} e^{−z²/2} dz,

where z₁ = (k₁ − np)/√(npq) and z₂ = (k₂ − np)/√(npq).

This is a very good approximation when both np and nq are greater than 5.

Example 4: A coin is tossed 10 times. Find the probability of getting between 4 and 7
heads inclusive using (a) the binomial distribution and (b) the normal approximation
to the binomial distribution.

Solution:

(a) Let X denote the number of heads in 10 tosses. Then X ~ B(10, 1/2), with
np = 5 and npq = 2.5, and

P(4 ≤ X ≤ 7) = Σ_{x=4}^{7} C(10, x)(1/2)^x (1/2)^{10−x} = (1/2)^{10} Σ_{x=4}^{7} C(10, x)

= [C(10,4) + C(10,5) + C(10,6) + C(10,7)] / 1024 = 792/1024 ≈ 0.7734.

(b) The discrete binomial probability distribution is approximated by the continuous normal
probability distribution. With the continuity correction, the integers 4, …, 7 correspond to
the interval (3.5, 7.5). Thus,

P(4 ≤ X ≤ 7) ≈ P(3.5 ≤ X ≤ 7.5) = P((3.5 − 5)/√2.5 ≤ Z ≤ (7.5 − 5)/√2.5)

= P(−0.95 ≤ Z ≤ 1.58) = P(0 ≤ Z ≤ 0.95) + P(0 ≤ Z ≤ 1.58)

= 0.3289 + 0.4429 = 0.7718.
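Both computations of Example 4 can be reproduced with scipy.stats (an illustrative addition to the notes):

from scipy.stats import binom, norm

exact = binom.cdf(7, 10, 0.5) - binom.cdf(3, 10, 0.5)        # P(4 <= X <= 7)
mu, sigma = 5.0, (10 * 0.5 * 0.5) ** 0.5
approx = norm.cdf((7.5 - mu) / sigma) - norm.cdf((3.5 - mu) / sigma)
print(exact, approx)                                         # ≈ 0.7734 vs ≈ 0.7718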


Exponential distribution: A c.r.v. X is said to follow the exponential distribution with
parameter λ > 0 if its p.d.f. is given by

f(x) = λ e^{−λx}, x > 0 (and 0 otherwise).

The c.d.f. is given by

F(x) = P(X ≤ x) = ∫_0^x f(t) dt = ∫_0^x λ e^{−λt} dt = [−e^{−λt}]_0^x = 1 − e^{−λx}

∴ F(x) = 1 − e^{−λx}, x > 0.

Notation: X ~ Exp(λ). Read as X follows the exponential distribution with parameter λ.

Real life examples of exponential distribution

1. The time taken to serve a customer at a petrol pump, railway booking counter or any
other service facility.
2. The period of time for which an electronic component operates without any
breakdown.
3. The time between two successive arrivals at any service facility.

Mean and Variance of the exponential distribution

For r ≥ 1, E(X^r) = ∫_0^∞ x^r f(x) dx = ∫_0^∞ x^r λ e^{−λx} dx.

Let t = λx. Then t varies from 0 to ∞ and dx = dt/λ. Then

E(X^r) = λ ∫_0^∞ (t/λ)^r e^{−t} (dt/λ) = (1/λ^r) ∫_0^∞ t^r e^{−t} dt

∴ E(X^r) = Γ(r + 1)/λ^r = r!/λ^r.   (using the Gamma function)

Thus, mean E(X) = 1/λ and E(X²) = 2/λ².

Hence V(X) = E(X²) − [E(X)]² = 2/λ² − 1/λ² = 1/λ².

Therefore, E(X) = 1/λ and V(X) = 1/λ².

Example 5: Assume that the length of phone calls made at a particular telephone
booth is exponentially distributed with a mean of 3 minutes. If you arrive at the
telephone booth just as Ramu was about to make a call, find the following:

a. The probability that you will wait more than 5 minutes before Ramu is done with
the call.
b. The probability that Ramu's call will last between 2 minutes and 6 minutes.

Solution: Let X be a r.v. that denotes the length of calls made at the telephone booth.
Since the mean length of calls is 1/λ = 3 minutes, the p.d.f. is given by

f(x) = (1/3) e^{−x/3}, x > 0.

a. P(X > 5) = ∫_5^∞ f(x) dx = ∫_5^∞ (1/3) e^{−x/3} dx = [−e^{−x/3}]_5^∞ = e^{−5/3}.

b. P(2 < X < 6) = ∫_2^6 (1/3) e^{−x/3} dx = [−e^{−x/3}]_2^6 = e^{−2/3} − e^{−2}.
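Both probabilities can be checked with scipy.stats.expon, where scale = 1/λ is the mean (an illustrative addition to the notes):

from scipy.stats import expon

print(expon.sf(5, scale=3))                             # P(X > 5) = e^(-5/3) ≈ 0.189
print(expon.cdf(6, scale=3) - expon.cdf(2, scale=3))    # P(2 < X < 6) ≈ 0.378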

Memorylessness property of the exponential distribution

The exponential distribution is used extensively in reliability engineering to model the
lifetimes of systems. Suppose the life X of an equipment is exponentially distributed
with a mean of 1/λ. Assume that the equipment has not failed by time t. We want to find
the probability that X > t + s, given that X > t, for some nonnegative additional time s.

Thus,

P(X > t + s | X > t) = P(X > t + s)/P(X > t) = e^{−λ(t+s)}/e^{−λt} = e^{−λs} = P(X > s).

This indicates that the process only remembers the present and not the past.

Example 6: In Example 5, Ramu, who is using the phone at the telephone booth, had
already talked for some minutes before you arrived. According to the memorylessness
property of the exponential distribution, the mean time until Ramu is done with the
call is still 3 minutes. The random variable forgets the length of time the call had lasted
before you arrived.

Relationship between exponential and Poisson distributions

Let λ denote the mean number of arrivals per unit of time, say per second. Then the
mean number of arrivals in t seconds is λt.

Let N(t) denote the number of arrivals during an interval of t seconds; then N(t) ~ P(λt).

Let T denote the time between two successive arrivals.

If T > t, i.e., no arrival occurs in an interval of length t, then

P(T > t) = P(N(t) = 0) = e^{−λt},

i.e., F_T(t) = 1 − e^{−λt}, so T ~ Exp(λ).


Standard Normal Distribution Table

z .00 .01 .02 .03 .04 .05 .06 .07 .08 .09
0.0 .0000 .0040 .0080 .0120 .0160 .0199 .0239 .0279 .0319 .0359
0.1 .0398 .0438 .0478 .0517 .0557 .0596 .0636 .0675 .0714 .0753
0.2 .0793 .0832 .0871 .0910 .0948 .0987 .1026 .1064 .1103 .1141
0.3 .1179 .1217 .1255 .1293 .1331 .1368 .1406 .1443 .1480 .1517
0.4 .1554 .1591 .1628 .1664 .1700 .1736 .1772 .1808 .1844 .1879
0.5 .1915 .1950 .1985 .2019 .2054 .2088 .2123 .2157 .2190 .2224
0.6 .2257 .2291 .2324 .2357 .2389 .2422 .2454 .2486 .2517 .2549
0.7 .2580 .2611 .2642 .2673 .2704 .2734 .2764 .2794 .2823 .2852
0.8 .2881 .2910 .2939 .2967 .2995 .3023 .3051 .3078 .3106 .3133
0.9 .3159 .3186 .3212 .3238 .3264 .3289 .3315 .3340 .3365 .3389
1.0 .3413 .3438 .3461 .3485 .3508 .3531 .3554 .3577 .3599 .3621
1.1 .3643 .3665 .3686 .3708 .3729 .3749 .3770 .3790 .3810 .3830
1.2 .3849 .3869 .3888 .3907 .3925 .3944 .3962 .3980 .3997 .4015
1.3 .4032 .4049 .4066 .4082 .4099 .4115 .4131 .4147 .4162 .4177
1.4 .4192 .4207 .4222 .4236 .4251 .4265 .4279 .4292 .4306 .4319
1.5 .4332 .4345 .4357 .4370 .4382 .4394 .4406 .4418 .4429 .4441
1.6 .4452 .4463 .4474 .4484 .4495 .4505 .4515 .4525 .4535 .4545
1.7 .4554 .4564 .4573 .4582 .4591 .4599 .4608 .4616 .4625 .4633
1.8 .4641 .4649 .4656 .4664 .4671 .4678 .4686 .4693 .4699 .4706
1.9 .4713 .4719 .4726 .4732 .4738 .4744 .4750 .4756 .4761 .4767
2.0 .4772 .4778 .4783 .4788 .4793 .4798 .4803 .4808 .4812 .4817
2.1 .4821 .4826 .4830 .4834 .4838 .4842 .4846 .4850 .4854 .4857
2.2 .4861 .4864 .4868 .4871 .4875 .4878 .4881 .4884 .4887 .4890
2.3 .4893 .4896 .4898 .4901 .4904 .4906 .4909 .4911 .4913 .4916
2.4 .4918 .4920 .4922 .4925 .4927 .4929 .4931 .4932 .4934 .4936
2.5 .4938 .4940 .4941 .4943 .4945 .4946 .4948 .4949 .4951 .4952


2.6 .4953 .4955 .4956 .4957 .4959 .4960 .4961 .4962 .4963 .4964
2.7 .4965 .4966 .4967 .4968 .4969 .4970 .4971 .4972 .4973 .4974
2.8 .4974 .4975 .4976 .4977 .4977 .4978 .4979 .4979 .4980 .4981
2.9 .4981 .4982 .4982 .4983 .4984 .4984 .4985 .4985 .4986 .4986
3.0 .4987 .4987 .4987 .4988 .4988 .4989 .4989 .4989 .4990 .4990
3.1 .4990 .4991 .4991 .4991 .4992 .4992 .4992 .4992 .4993 .4993
3.2 .4993 .4993 .4994 .4994 .4994 .4994 .4994 .4995 .4995 .4995
3.3 .4995 .4995 .4995 .4996 .4996 .4996 .4996 .4996 .4996 .4997
3.4 .4997 .4997 .4997 .4997 .4997 .4997 .4997 .4997 .4997 .4998
3.5 .4998 .4998 .4998 .4998 .4998 .4998 .4998 .4998 .4998 .4998


4.4 Central Limit Theorem
Let {X_n} be a sequence of independent random variables. Let S_n = Σ_{i=1}^{n} X_i. In the laws
of large numbers we considered convergence of S_n/n to a constant, either in probability
(in the case of the WLLN) or almost surely (in the case of the SLLN). Here we consider a
different situation, namely one where the limit of the (suitably standardized) sum S_n is a
normal variate. If it is, the sequence {X_n} is said to follow the central limit theorem
(CLT) or normal convergence. In this module we consider different central limit
theorems.

Definition: A sequence {X_n} of independent r.vs with means E(X_i) = μ_i and variances
V(X_i) = σ_i² is said to follow the Central Limit Theorem (CLT), under certain
conditions, if the random variable Z_n = (S_n − μ)/σ is asymptotically
normal (AN) with mean 0 and variance 1, where μ = Σ_{i=1}^{n} μ_i and σ² = Σ_{i=1}^{n} σ_i².

Notation: S_n ~ AN(μ, σ²). Read as S_n follows asymptotically the normal distribution with mean μ
and variance σ².

Note:

1. S_n is asymptotically normal means S_n follows the normal distribution as n → ∞.

2. If Z_n = (S_n − μ)/σ, then Z_n follows asymptotically the standard normal distribution with mean
0 and variance 1, and we write Z_n ~ AN(0, 1).


Variations of the CLT

The following are some variations of the CLT, which are stated without proof.

Theorem 1 (De Moivre-Laplace CLT): If {X_n} is a sequence of Bernoulli trials
with constant probability of success equal to p, then the distribution of the r.v.
S_n = X₁ + X₂ + ⋯ + X_n, where the X_i's are independent, is asymptotically normal:

S_n is AN(np, npq).

Theorem 2 (Lindeberg-Levy CLT): This CLT theorem is for i.i.d. r.vs.

If {X_n} is a sequence of i.i.d. r.vs with mean E(X_i) = μ and variance
V(X_i) = σ² < ∞ for all i, then the sum S_n = X₁ + ⋯ + X_n is asymptotically normal
with mean nμ and variance nσ².

Theorem 3 (Liapounoff's CLT): This CLT theorem is for independent but not
identically distributed random variables.

Let {X_n} be a sequence of independent random variables with means E(X_i) = μ_i
and variances V(X_i) = σ_i². Let us assume that the third absolute moment, say
ρ_i³ = E(|X_i − μ_i|³), of X_i about its mean exists and is
finite for each i. Let ρ³ = Σ_{i=1}^{n} ρ_i³, μ = Σ_{i=1}^{n} μ_i and σ² = Σ_{i=1}^{n} σ_i². If lim_{n→∞} ρ/σ = 0, then the sum

S_n = Σ_{i=1}^{n} X_i is AN(μ, σ²).

Example 1: If X₁, X₂, … are i.i.d. r.vs with p.m.f. , find the asymptotic
distribution of S_n = Σ_{i=1}^{n} X_i.

Solution: Here E(X_i) = 0 and V(X_i) = 1.

Then E(S_n) = E(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} E(X_i) = 0 and

V(S_n) = Σ_{i=1}^{n} V(X_i) = Σ_{i=1}^{n} 1 = n.

Since the mean and variance exist for each X_i, by the Lindeberg-Levy CLT, S_n ~ AN(0, n), or
S_n/√n ~ AN(0, 1).

Example 2: If X₁, X₂, … are i.i.d. with E(X_i) = 0 and V(X_i) = σ² < ∞, and

X̄_n = (1/n) Σ_{i=1}^{n} X_i, then show that for any ,

as .

Solution: Let S_n = Σ_{i=1}^{n} X_i. Then E(S_n) = Σ_{i=1}^{n} E(X_i) = 0 and

V(S_n) = Σ_{i=1}^{n} V(X_i) = nσ². Since the X_i are i.i.d. with finite mean and variance,
we have S_n ~ AN(0, nσ²) (by the Lindeberg-Levy CLT).
we have (by Lindeberg-Levy CLT).

Let . Then and

Thus, .

We have , where

Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^{−t²/2} dt


for ….

But …

(A result in normal distribution)

From and , we have

Example 3: Examine if CLT holds for the sequence with p.m.f


, .

Solution: Since it is a non-identically distributed sequence of r.vs, for the CLT to hold,
we have to verify Liapounoff's condition.

We have ,

and

Further, we have


Thus, lim_{n→∞} ρ/σ ≠ 0. Thus, Liapounoff's condition is not
satisfied, and hence we cannot say that the CLT holds for {X_k}.

Example 4: Examine if CLT holds for the sequence with p.m.f


.

Solution: Since it is a non-identically distributed sequence of r.vs, for the CLT to hold, we have
to verify Liapounoff's condition.

We have ,

and

Further, we have

μ = Σ_{k=1}^{n} μ_k = 0 + 0 + ⋯ = 0,

σ² = Σ_{k=1}^{n} σ_k² = Σ_{k=1}^{n} 1 = n and ρ³ = Σ_{k=1}^{n} ρ_k³ = 1 + 2 + ⋯ + n.

Note that

Now, ρ/σ → 0 if

Thus, the CLT holds if

Therefore, the CLT holds for the sequence {X_k}.


Applications of the Central Limit Theorem:

In the case of the Bernoulli, binomial and Poisson distributions, evaluation of probabilities
using the p.m.f. is tedious. Using the normal approximation to these
distributions for large samples, the probabilities can be easily evaluated.

(a) Let {X_i} be a sequence of i.i.d. Bernoulli variates with parameter p.

Let S_n = Σ_{i=1}^{n} X_i.

Then E(S_n) = np and V(S_n) = npq, where q = 1 − p.

By the Lindeberg-Levy CLT, for large n,

S_n ~ AN(np, npq).   … (1)

Let Z_n = (S_n − np)/√(npq).

Then from (1), Z_n ~ AN(0, 1), where the limiting distribution is N(0, 1).

Thus, P(a ≤ Z_n ≤ b) ≈ (1/√(2π)) ∫_a^b e^{−z²/2} dz,

and the RHS can be evaluated using standard normal tables for given real
numbers a and b.

(b) Let {X_i} be a sequence of i.i.d. binomial variates with X_i ~ B(r, p).

Let S_n = Σ_{i=1}^{n} X_i.

Then E(S_n) = E(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} E(X_i) = Σ_{i=1}^{n} rp = nrp

and V(S_n) = V(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} V(X_i) = Σ_{i=1}^{n} rp(1 − p) = nrp(1 − p).

Thus E(S_n) = nrp and V(S_n) = nrp(1 − p).

By the Lindeberg – Levy CLT, for large n, we have S_n ~ AN(nrp, nrp(1 − p)).

Let Z_n = (S_n − nrp)/√(nrp(1 − p)).

Then Z_n ~ AN(0, 1), where

P(a ≤ Z_n ≤ b) ≈ (1/√(2π)) ∫_a^b e^{−z²/2} dz,

and the RHS can be evaluated using standard normal tables for given real
numbers a and b.
n
(c) Let be a sequence of i.i.d Poisson variates . Let Sn   X i
i 1

n
Here . Then E  Sn    E  X i   n
i 1

 n  n
and V  Sn   V   X i   V  X i 
 i1  i1

Thus, by Lindeberg – Levy CLT, for large ,

Let , Then , where

1
b  z2
The probabilities  a
e 2
dz


and the RHS can be evaluated using standard normal tables for given real
numbers a and b.

Example 5: A sample of 100 items is taken at random from a batch known to
contain 40% defectives. What is the probability that the sample contains

(i) at least 44 defectives,

(ii) exactly 44 defectives?

Solution:

Let X_i = 1 if the i-th sampled item is defective and 0 otherwise, for i = 1, 2, …, 100.

It is given that p = P(defective) = 0.4.

Then each X_i follows the Bernoulli distribution with parameter p = 0.4.

Let S_n = Σ_{i=1}^{100} X_i. Then S_n ~ B(100, 0.4),

since n = 100 and p = 0.4.

Since n is large, computation of probabilities using the binomial formula is difficult.
Hence, by the CLT, we use the normal approximation to compute the probabilities of S_n
instead of the binomial distribution.

Here E(S_n) = np = 40 and V(S_n) = npq = 24.

Let Z_n = (S_n − 40)/√24.

Then by the Lindeberg – Levy CLT, we have Z_n ~ AN(0, 1), where Z_n is approximately N(0, 1).


(i) It should be noted that the continuous normal distribution is approximating
the discrete binomial distribution, so that the continuity correction has to
be taken into account in determining the various probabilities. So finding
the probability of at least 44 defectives in a sample of 100 items requires
finding the area under the normal curve from 43.5 to ∞.

Therefore, the probability of at least 44 defectives is given by

P(S_n ≥ 44) ≈ P(Z ≥ (43.5 − 40)/√24) = P(Z ≥ 0.71) = 0.5 − 0.2611 = 0.2389.

(See the standard normal distribution table.)

(ii) The probability of exactly 44 defectives is

P(S_n = 44) ≈ P(43.5 ≤ S_n ≤ 44.5) = P(0.71 ≤ Z ≤ 0.92) = 0.3212 − 0.2611 = 0.0601.

(See table.)

Note: Using the binomial distribution, P(S_n ≥ 44) = Σ_{k=44}^{100} C(100, k)(0.4)^k (0.6)^{100−k},

and P(S_n = 44) = C(100, 44)(0.4)^{44}(0.6)^{56} (using binomial tables).


As can be seen by comparing the answers, both sets of answers are remarkably
close.

Example 6: Let X₁, X₂, …, X_n be i.i.d. Poisson variables with parameter λ. Use the CLT to
estimate P(S_n ≥ ), where S_n = Σ_{i=1}^{n} X_i, and .

Solution: Since the X_i are i.i.d. P(λ), for i = 1, 2, …, n,

E(S_n) = Σ_{i=1}^{n} E(X_i) = Σ_{i=1}^{n} λ = nλ and

V(S_n) = V(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} V(X_i) = nλ.

Hence, by the Lindeberg – Levy CLT, for large n, we have

. After applying the continuity correction, the


required probability is

where

(From standard normal table)


4.4 Central Limit Theorem

Exercise:
1. The lifetime of a certain brand of an electric bulb may be considered as a r.v.
with mean and standard deviation . Find the probability,
using CLT that the average lifetime of bulbs exceeds 1250 .

2. A distribution has unknown mean and variance equal to . Use CLT to


find how large a sample should be taken from the distribution in order that the
probability will be at least that the sample mean will be within of the
population mean.

3. A random sample of size is taken from a population whose mean is and


variance is . Use CLT, with what probability can we assert that the mean of
the sample will not differ from by more than .

4. The guaranteed average life of a certain type of electric light bulb is


with a standard deviation of . It is decided to sample the output so as
to ensure that of the bulbs do not fall short of the guaranteed average by
more than . Use the CLT to find the minimum sample size.

5. If , are independent r.vs, each having a Poisson distribution


with parameter and , evaluate , using
CLT.


Answers:
1.
2. at least
3.
4.
5.


UNIT-V: Sampling Theory

