Contents
7.1 Poisson Distribution
7.2 Moment Generating Function (M.G.F.) of P(m)
7.3 Additive Property of Poisson Distribution
7.4 Recurrence Relation between Probabilities of Poisson Distribution
7.5 Poisson Distribution as a Limiting Form of Binomial Distribution
7.6 Mode of P(m)
7.7 Geometric Distribution
7.8 Recurrence Relation between Probabilities of Geometric Distribution
7.9 Distribution Function of Geometric Distribution
7.10 Lack of Memory Property of Geometric Distribution
7.11 Real Life Situations of Geometric Distribution
7.12 Negative Binomial Distribution
7.13 Real Life Situations for Negative Binomial Distribution
Key Words:
Rare Event, Poisson Distribution, Geometric Distribution, Forgetfulness (Lack of Memory) Property.
Objectives:
Understand the concept of a discrete random variable taking countably infinite values.
Understand the Poisson distribution as the limiting form of the binomial distribution.
Understand the specific situations for the use of these models.
Compute probabilities for Poisson and geometric distributions.
Understand the situations where the negative binomial distribution is used.
Learn interrelations among the different probability distributions.
7.1 Poisson Distribution
We are now familiar with the concept of a discrete random variable defined on a sample space and taking countably infinite values. In this section, we deal with discrete probability distributions. The theory of probability distributions is found immensely useful in studying the events occurring in day to day life. Some typical probability distributions are widely applicable in real life situations and areas of research. Among these, the Poisson distribution plays an important role.
We experience a number of situations where the chance of occurrence of an event in a short time interval is very small; however, there are infinitely many opportunities for it to occur. The number of occurrences of such an event follows Poisson distribution. For example:
(a) Number of defective items found in a good lot of large size.
(b) Number of times a system of components fails.
(c) Number of persons standing in a queue.
(d) Number of deaths due to snake-bite in a certain village.
(e) Number of persons affected due to radiation, residing near a nuclear power plant.
The examples stated above give an idea about applications of Poisson distribution. Poisson distribution is applicable in statistical quality control, especially in constructing control charts and acceptance sampling plans. It is also applicable in the theory of reliability. It is found to be useful in the theory of queues.
Definition of Poisson distribution: [April 2012]
A discrete random variable X taking values 0, 1, 2, ... is said to follow Poisson distribution with parameter m if its probability mass function (p.m.f.) is given by
P[X = x] = e^{-m} m^x / x! ; x = 0, 1, 2, ... ; m > 0
         = 0 ; otherwise
Note:
1. "X follows Poisson distribution with parameter m" is symbolically written as X → P(m). Its p.m.f. is denoted by P(x).
2. We require the following results repeatedly in the further derivations:
   e^m = 1 + m + m^2/2! + m^3/3! + ...
   log_e(1 - a) = -(a + a^2/2 + a^3/3 + ...), |a| < 1
3. We shall verify that P(x) is a p.m.f.
(i) P(x) ≥ 0 for all x, since e^{-m} > 0 and m^x / x! ≥ 0.
(ii) Σ_{x=0}^∞ P(x) = Σ_{x=0}^∞ e^{-m} m^x / x!
                    = e^{-m} Σ_{x=0}^∞ m^x / x!
                    = e^{-m} (1 + m + m^2/2! + ...)
                    = e^{-m} e^m = 1
Hence P(x) is a p.m.f.
Mean and variance:
E(X) = Σ_{x=0}^∞ x e^{-m} m^x / x!
     = Σ_{x=1}^∞ e^{-m} m^x / (x - 1)!          (since the term corresponding to x = 0 is zero)
     = m e^{-m} Σ_{x=1}^∞ m^{x-1} / (x - 1)!
     = m e^{-m} e^m = m
∴ E(X) = m
μ2' = E(X²) = Σ_{x=0}^∞ x² P(x)
    = Σ_{x=0}^∞ [x(x - 1) + x] e^{-m} m^x / x!
    = Σ_{x=0}^∞ x(x - 1) e^{-m} m^x / x! + E(X)
    = e^{-m} m² Σ_{x=2}^∞ m^{x-2} / (x - 2)! + E(X)
    = e^{-m} m² e^m + m = m² + m
μ2 = Var(X) = E(X²) - [E(X)]² = m² + m - m² = m
∴ Var(X) = m
Note: Thus, the mean and variance of Poisson distribution are equal and each is equal to the parameter of the distribution.
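These results can also be checked numerically from the p.m.f. itself. Below is a minimal Python sketch; the value m = 2.5 and the truncation point of the infinite sums are arbitrary choices for illustration, not taken from the text.

```python
from math import exp, factorial

m = 2.5  # arbitrary illustrative value of the parameter

def pmf(x):
    # Poisson p.m.f.: P(X = x) = e^{-m} m^x / x!
    return exp(-m) * m ** x / factorial(x)

xs = range(60)                      # truncate the infinite sums; the tail is negligible
mean = sum(x * pmf(x) for x in xs)
second_raw = sum(x * x * pmf(x) for x in xs)
variance = second_raw - mean ** 2

print(mean, variance)               # both come out (numerically) equal to m = 2.5
```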
Illustrative Examples
Example 7.1: The average number of misprints per page of a book is 1.5. Assuming the distribution of the number of misprints to be Poisson, find
(i) the probability that a particular page is free from misprints,
(ii) the number of pages containing more than one misprint if the book contains 900 pages.
Solution: Let X: Number of misprints on a page in the book.
Given: X → P(m = 1.5), E(X) = m = 1.5.
Here the p.m.f. is given by
P[X = x] = e^{-1.5} (1.5)^x / x! ; x = 0, 1, 2, ...
(i) P[X = 0] = e^{-1.5} (1.5)^0 / 0! = e^{-1.5} = 0.223130 (from statistical tables)
Note: The Poisson probabilities for m = 0.1, 0.2, 0.3, ..., 15.0 are given in the statistical tables.
(ii) P[X > 1] = 1 - P[X ≤ 1] = 1 - {P(X = 0) + P(X = 1)}
             = 1 - {e^{-1.5} + e^{-1.5} (1.5) / 1!}
             = 1 - {0.223130 + 0.334695}     (from statistical tables)
             = 0.442175
∴ Number of pages in the book containing more than one misprint = (900) P[X > 1] = (900)(0.442175) = 397.9575 ≈ 398
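For readers without statistical tables, the probabilities in Example 7.1 can be reproduced directly; a short sketch assuming scipy is available:

```python
from scipy.stats import poisson

m = 1.5
p0 = poisson.pmf(0, m)                   # P(X = 0): a page free from misprints
p_more_than_one = 1 - poisson.cdf(1, m)  # P(X > 1)

print(round(p0, 6))                      # approximately 0.223130
print(round(900 * p_more_than_one))      # approximately 398 pages
```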
Fig. 7.1
(i) P[3 ≤ X ≤ 5] = P(3) + P(4) + P(5)
                 = 0.140374 + 0.175467 + 0.175467
                 = 0.491308
(ii) P[X > 3] = 1 - P[X ≤ 3]
              = 1 - [P(0) + P(1) + P(2) + P(3)]
              = 0.734974
[Figure: Probability bar diagram of Poisson distribution with m = 5]
Fig. 7.3 (a): Probability histogram of Poisson distribution with m = 5
Fig. 7.3 (b): Probability histogram of Poisson distribution with m = 3.9
The p.m.f. P(x) vs x of Poisson (m = 5) is plotted in the form of a bar diagram in Fig. 7.3.
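The bar heights shown in these figures can be regenerated from the p.m.f.; a rough sketch assuming numpy, scipy and matplotlib are available (the exact styling of the book's figures is not reproduced):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import poisson

x = np.arange(0, 15)
fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, m in zip(axes, [5, 3.9]):
    ax.bar(x, poisson.pmf(x, m))       # bar heights are the p.m.f. values P(x)
    ax.set_title(f"Poisson p.m.f., m = {m}")
    ax.set_xlabel("x")
    ax.set_ylabel("P(x)")
plt.show()
```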
7.2 Moment Generating Function (M.G.F.) of P(m)
Suppose X → P(m). Then
M_X(t) = E(e^{tX}) = Σ_{x=0}^∞ e^{tx} P(x) = Σ_{x=0}^∞ e^{tx} e^{-m} m^x / x!
       = e^{-m} Σ_{x=0}^∞ (m e^t)^x / x!
       = e^{-m} e^{m e^t}
∴ M_X(t) = e^{m(e^t - 1)}
Deduction of raw moments:
The first four raw moments can be obtained by expanding the m.g.f. M_X(t) in powers of t as follows:
M_X(t) = e^{m(e^t - 1)} = e^a, where a = m(t + t²/2! + t³/3! + ...)
       = 1 + a + a²/2! + a³/3! + a⁴/4! + ...
Collecting the coefficients of powers of t,
μ1' = coefficient of t/1! = m
μ2' = coefficient of t²/2! = m + m²
μ3' = coefficient of t³/3! = m + 3m² + m³
μ2 = μ2' - (μ1')² = m² + m - (m)² = m
μ3 = μ3' - 3 μ2' μ1' + 2(μ1')³ = (m + 3m² + m³) - 3(m² + m)m + 2(m)³ = m
Alternatively, differentiating the m.g.f.:
μ1' = [d/dt M_X(t)]_{t=0} = [m e^t e^{m(e^t - 1)}]_{t=0} = m
μ2' = [d²/dt² M_X(t)]_{t=0} = [d/dt (m e^t e^{m(e^t - 1)})]_{t=0}
    = [m e^t (m e^t + 1) e^{m(e^t - 1)}]_{t=0}
    = m(m + 1) = m² + m
μ4 = κ4 + 3 κ2² = m + 3m²
Note: Obtaining central moments of a Poisson r.v. using cumulants is the easiest method.
Coefficients of skewness and kurtosis:
β1 = μ3² / μ2³ = m² / m³ = 1/m,    γ1 = √β1 = 1/√m
β2 = μ4 / μ2² = (m + 3m²) / m² = 3 + 1/m,    γ2 = β2 - 3 = 1/m
Remark 1: If X₁ and X₂ are independent Poisson variates, one may be curious about the distribution of X₁ - X₂. It is worth noting that X₁ - X₂ does not follow Poisson distribution. This is obvious because X₁ - X₂ can take negative integral values, whereas a Poisson variate never takes negative values.
Result: If X₁ and X₂ are two independent Poisson variates with parameters m₁ and m₂ respectively, then the conditional distribution of X₁ given X₁ + X₂ = n is binomial with parameters n and m₁/(m₁ + m₂), n being a non-negative integer.
Proof: Since X₁ and X₂ are independent Poisson variables, X₁ + X₂ has Poisson distribution with parameter m₁ + m₂.
P[X₁ + X₂ = n] = e^{-(m₁ + m₂)} (m₁ + m₂)^n / n!
Since P(A|B) = P(A ∩ B) / P(B) ; P(B) ≠ 0,
P[X₁ = k | X₁ + X₂ = n] = P[X₁ = k, X₂ = n - k] / P[X₁ + X₂ = n]
= [e^{-m₁} m₁^k / k!] [e^{-m₂} m₂^{n-k} / (n - k)!] / [e^{-(m₁ + m₂)} (m₁ + m₂)^n / n!]
= C(n, k) [m₁/(m₁ + m₂)]^k [m₂/(m₁ + m₂)]^{n-k} ; k = 0, 1, ..., n
= p.m.f. of B(n, m₁/(m₁ + m₂))
Note: E(X₁ | X₁ + X₂ = n) = n m₁/(m₁ + m₂), Var(X₁ | X₁ + X₂ = n) = n m₁ m₂/(m₁ + m₂)².
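This result can be checked numerically by comparing the conditional probabilities with the binomial p.m.f.; a minimal sketch, where the values m₁ = 2, m₂ = 3 and n = 5 are illustrative assumptions:

```python
from scipy.stats import poisson, binom

m1, m2, n = 2.0, 3.0, 5          # illustrative parameter values
p = m1 / (m1 + m2)

for k in range(n + 1):
    # direct conditional probability P(X1 = k, X2 = n - k) / P(X1 + X2 = n)
    cond = (poisson.pmf(k, m1) * poisson.pmf(n - k, m2)
            / poisson.pmf(n, m1 + m2))
    print(k, round(cond, 6), round(binom.pmf(k, n, p), 6))  # the two columns agree
```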
Example 7.3: Suppose Xᵢ → P(mᵢ = 1.2 i) for i = 1, 2, 3, 4 and the Xᵢ are independent of each other. If S = X₁ + X₂ + X₃ + X₄, find (i) P(S = 12), (ii) P(S > 3), (iii) P(S - 2 < 2).
Solution: Xᵢ → Poisson(mᵢ = 1.2 i) and the Xᵢ are independent random variables. By the additive property,
S → P(m = 1.2 (1 + 2 + 3 + 4) = 12)
7.5 Poisson Distribution as a Limiting Form of Binomial Distribution
P(x) = C(n, x) p^x (1 - p)^{n-x}
     = [n(n - 1)(n - 2) ... (n - x + 1) / x!] p^x (1 - p)^{n-x}
     = (1)(1 - 1/n)(1 - 2/n) ... (1 - (x - 1)/n) (np)^x (1 - p)^{n-x} / x!
Taking the limit as n → ∞, with p = m/n so that np = m remains fixed, we have
lim_{n→∞} P(x) = lim_{n→∞} (1)(1 - 1/n)(1 - 2/n) ... (1 - (x - 1)/n) m^x (1 - m/n)^{n-x} / x!
Now, lim_{n→∞} (1 - m/n)^{n-x} = e^{-m}, and each factor (1 - k/n) → 1.
∴ lim_{n→∞} P(x) = (1)(1) ... (1) m^x e^{-m} / x!, where m is constant
               = e^{-m} m^x / x! ; x = 0, 1, 2, ...
which is the p.m.f. of Poisson distribution.
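The limiting behaviour can be observed numerically by holding m = np fixed and letting n grow; a brief sketch (the values of m, x and n below are illustrative choices):

```python
from scipy.stats import binom, poisson

m, x = 2.0, 3                          # fixed mean m = np and the point of comparison
for n in [10, 50, 200, 1000]:
    p = m / n
    print(n, round(binom.pmf(x, n, p), 6), round(poisson.pmf(x, m), 6))
# the binomial column approaches the Poisson value as n increases
```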
Example 7.7: A manufacturer of cotter pins knows that 1% of his product is defective. He sells the cotter pins in boxes of 100 pins each and guarantees that not more than 2 pins in a box will be defective. Find the probability that a box will meet the guarantee.
Solution: Let X: Number of defective cotter pins in a box of 100 pins.
p: Probability of getting a defective cotter pin = 0.01
X → B[n = 100, p = 0.01]
Since n is sufficiently large (n > 20) and p is very small (< 0.05), we use the Poisson approximation to the binomial distribution.
Here n = 100, p = 0.01, m = np = 1.
∴ X → Poisson(m = 1)
Hence the p.m.f. is given by
P[X = x] = e^{-1} (1)^x / x! ; x = 0, 1, 2, ...
P(a box will meet the guarantee) = P(X ≤ 2)
= P[X = 0] + P[X = 1] + P[X = 2]
= e^{-1}/0! + e^{-1}/1! + e^{-1}/2!
= 0.367879 + 0.367879 + 0.183940 = 0.919698
Example 7.8: 5% of the families in Kolkata do not use gas as a fuel. If a sample of 50 families is selected at random in Kolkata, what will be the probability that less than 4 families in the sample do not use gas as a fuel?
Solution: Suppose X: number of families not using gas as fuel out of 50 selected families.
p = Probability of not using gas as fuel = 0.05.
∴ X → B[n = 50, p = 0.05]
We observe that n is large (n > 20) and p is small. Hence, using the Poisson approximation to the binomial distribution,
X → P[m = np = (50)(0.05) = 2.5]
∴ The p.m.f. is given by
P[X = x] = e^{-2.5} (2.5)^x / x! ; x = 0, 1, 2, ...
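The required probability P(X < 4) = P(X ≤ 3) then follows from this p.m.f.; a short sketch of the remaining arithmetic (the numerical value in the comment is a computed check, not quoted from the text):

```python
from math import exp, factorial

m = 2.5
# P(X < 4) = P(X <= 3) = sum of the p.m.f. over x = 0, 1, 2, 3
prob = sum(exp(-m) * m ** x / factorial(x) for x in range(4))
print(round(prob, 4))   # roughly 0.7576
```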
Fitting of Poisson distribution:
Illustrative Examples
Example 7.9: In Mumbai, the distribution of meningitis cases in 432 municipal wards in 1996 was as follows:

Number of cases (X):   0     1     2    3    4    5 and above
Number of wards (f):   223   142   48   15   4    0

Fit a Poisson distribution to the above data and obtain the expected frequencies.
Solution: Fitting a Poisson distribution means estimating the parameter and finding the probabilities of the variable for the given data, assuming the distribution to be Poisson. The procedure of fitting a Poisson distribution is as follows:
The p.m.f. is P[X = x] = e^{-m} m^x / x! ; x = 0, 1, 2, ... ; m > 0
            = 0 ; otherwise
Step 1: The parameter m is estimated by m̂ = x̄ [since E(X) = m]
        m̂ = x̄ = 0.6921296
Step 2: P[X = 0] = P(0) = e^{-m̂} = 0.5005
Step 3: Using the recurrence relation
        P[X = x + 1] = m/(x + 1) · P[X = x],
        find the remaining probabilities.
Step 4: Prepare the table as follows:
X = x          m/(x+1) = 0.6921296/(x+1)    P[X = x] = P(x)    Expected freq. = N·P(x)
0              0.6921296                    0.500500           216.216
1              0.3460648                    0.346411           149.650
2              0.2307099                    0.119881           51.7884
3              0.1730324                    0.027658           11.9481
4              0.1384259                    0.004786           2.0674
5 and above    --                           0.000765           0.3307
Total                                       1.000000           432
Note: The column of P(x) is to be obtained as follows: we first write the value of P(0) in front of x = 0. P(1) is obtained by using the recurrence relation between the probabilities, so P(1) = (m/1)·P(0). Similarly P(2), P(3) and all remaining values in the column are calculated. The probability for the class "5 and above" is obtained by subtraction, so that the total is 1.
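Steps 1 to 4 of this fitting procedure translate directly into a few lines of Python; a sketch reproducing the table of Example 7.9, where the last class "5 and above" collects the remaining probability:

```python
from math import exp

x_vals = [0, 1, 2, 3, 4]            # "5 and above" is handled at the end
freqs = [223, 142, 48, 15, 4]
N = 432

# Step 1: estimate m by the sample mean
m = sum(x * f for x, f in zip(x_vals, freqs)) / N     # about 0.69213

# Steps 2 and 3: P(0) = e^{-m}, then P(x + 1) = m / (x + 1) * P(x)
probs = [exp(-m)]
for x in range(4):
    probs.append(m / (x + 1) * probs[-1])
probs.append(1 - sum(probs))        # probability of "5 and above" by subtraction

# Step 4: expected frequencies N * P(x)
for label, p in zip(x_vals + ["5 and above"], probs):
    print(label, round(p, 6), round(N * p, 4))
```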
Miscellaneous Problems:
Example 7.10: If X and Y are two independent Poisson variates such that Var(X + Y) = 3 and P(X = 2 | X + Y = 6) = 80/243, find the means of X and Y.
Solution: Suppose X → P(m₁) and Y → P(m₂).
X and Y are independent. Hence, by the additive property of Poisson distribution,
X + Y → P(m₁ + m₂)
Var(X + Y) = m₁ + m₂ = 3
The conditional distribution of X given X + Y = n is B(n, m₁/(m₁ + m₂)). Here, n = 6.
∴ P(X = 2 | X + Y = 6) = C(6, 2) (m₁/3)² (1 - m₁/3)⁴ = 80/243
15 m₁² (3 - m₁)⁴ / 3⁶ = 80/243
m₁² (3 - m₁)⁴ = 16
m₁ (3 - m₁)² = 4
m₁³ - 6m₁² + 9m₁ - 4 = 0
(m₁ - 1)² (m₁ - 4) = 0
Hence, m₁ = 1 or m₁ = 4. Since m₁ + m₂ = 3 and m₂ ≥ 0, m₁ = 4 is not admissible.
∴ m₁ = 1 and m₂ = 2.
Example 7.11: Let X and Y be two independent Poisson random variables with means 4 and 6 respectively. Find [P.U. 1999]
(i) the standard deviation of X + Y, (ii) P[Y = 3 | X + Y = 4].
Solution: (i) Since X and Y are independent Poisson variates, by the additive property,
X + Y → P(m = 4 + 6 = 10)
E(X + Y) = 10 = Var(X + Y)
∴ S.D. of X + Y = √10
(ii) The conditional distribution of Y given X + Y = n is B(n, m₂/(m₁ + m₂)).
Here n = 4, m₁ = 4, m₂ = 6, p = 6/10 = 0.6.
∴ P[Y = 3 | X + Y = 4] = C(4, 3) (0.6)³ (0.4)¹
                       = 4 × 0.216 × 0.4
                       = 0.3456
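Part (ii) is a single binomial term and can be verified quickly; a one-line check assuming scipy is available:

```python
from scipy.stats import binom

# Y | X + Y = 4 is binomial with n = 4 and p = m2 / (m1 + m2) = 6 / 10
print(binom.pmf(3, 4, 0.6))   # 0.3456
```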
7.6 Mode of P(m)
Let X → P(m). Using the p.m.f. we have,
P(x) / P(x - 1) = [e^{-m} m^x / x!] / [e^{-m} m^{x-1} / (x - 1)!] = m/x    ... (1)
The value of the parameter m may be integral or fractional. We shall obtain the value of the mode in both the cases.
Case (i): When m is not an integer (fraction)
P(x) / P(x - 1) = m/x > 1 if x < m, by (1), i.e. for x = 1, 2, ..., [m].
Hence P(x) increases up to x = [m] and decreases thereafter, so the mode is [m].
E|X - 1| = Σ |x - 1| P(x)
         = 2 P(0) + E(X) - 1
         = 2e^{-1} + E(X) - 1      (since here E(X) = m = 1)
         = 2e^{-1} + 1 - 1
         = 2e^{-1}
Example 7.16: If X → Poisson(m = 3.4), find its mean, mode and median.
Solution: Mean = m = 3.4, Mode = [m] = [3.4] = 3.
To find the median we evaluate cumulative probabilities.
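A minimal sketch of this evaluation follows; the median is taken here as the smallest x with P(X ≤ x) ≥ 0.5 (this convention for a discrete median is an assumption, and the cumulative values are computed rather than quoted from tables):

```python
from math import exp, factorial

m = 3.4
cum = 0.0
for x in range(20):
    cum += exp(-m) * m ** x / factorial(x)   # P(X <= x)
    print(x, round(cum, 4))
    if cum >= 0.5:                           # smallest x with P(X <= x) >= 0.5
        print("median =", x)                 # turns out to be 3 here
        break
```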
7.7 Geometric Distribution
Suppose a coin is tossed till a head (H) occurs for the first time. We may get a head at the 1st trial, at the 2nd trial or at the 3rd trial and so on. Let X denote the number of failures before getting a head for the first time. We want to find the probability distribution of the discrete random variable X. Suppose the probability of getting a head in a single trial is p, and the probability of getting a tail in a single trial is q = 1 - p [p + q = 1].
Assuming that the trials are independent, the probability distribution of X will be as follows:

X:     0    1     2      3      ...   x      ...
P(X):  p    qp    q²p    q³p    ...   qˣp    ...

Such a probability distribution is called Geometric distribution with parameter p, since these probabilities form a geometric progression.
Geometric distribution is applied in the field of reliability and queueing theory (see real life situations).
Probability mass function of geometric distribution:
Consider a sequence of Bernoulli trials with constant probability of success p in a single trial. Let X represent the number of failures before the first success. If we have x failures before the first success, then the (x + 1)th trial results into a success and the corresponding sequence will be
FF ... F S     (x failures followed by the first success)
so that
P(X = x) = qˣ p ; x = 0, 1, 2, ... ; 0 < p < 1, q = 1 - p.
Thus, the number of failures before the first success in an infinite sequence of Bernoulli trials follows geometric distribution.
3. We can verify that Σ_{x=0}^∞ P(X = x) = 1 as follows:
Σ_{x=0}^∞ P(X = x) = Σ_{x=0}^∞ qˣ p = p(1 + q + q² + q³ + ...) = p(1 - q)⁻¹ = p/p = 1
4. The probabilities P(X = x) happen to be terms in a geometric series with common ratio q. Hence, the probability distribution is named as geometric distribution.
5. In geometric distribution, P(X = x) denotes the probability that there are x failures before the first success. P(Y = y) denotes the probability that y trials are needed for getting the first success. Hence, it is called a waiting time distribution.
6. Mode: The p.m.f. at X = 0 is p and it goes on decreasing as the values of X increase. Hence the p.m.f. is maximum at x = 0.
∴ Mode of X = 0 and Mode of Y = 1.
Mean and variance:
E(X) = q/p,    Var(X) = q/p²
Note: (1) Mean and variance for the other form of the geometric distribution:
If the p.m.f. of Y is
P[Y = y] = p q^{y-1} ; y = 1, 2, 3, ... ; 0 < p < 1, q = 1 - p
then, noting that Y = X + 1,
E(Y) = E(X) + 1 = q/p + 1 = 1/p
∴ E(Y) = 1/p
and Var(Y) = Var(X + 1) = Var(X) = q/p²
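These formulae can be verified numerically from the p.m.f. of X; a short sketch, with p = 0.3 as an arbitrary illustrative value and the infinite sums truncated:

```python
p = 0.3                       # arbitrary illustrative value
q = 1 - p

# P(X = x) = q^x p, x = 0, 1, 2, ...; truncate where the tail is negligible
pmf = [q ** x * p for x in range(200)]
mean = sum(x * px for x, px in enumerate(pmf))
variance = sum(x * x * px for x, px in enumerate(pmf)) - mean ** 2

print(mean, q / p)            # both approximately 2.3333
print(variance, q / p ** 2)   # both approximately 7.7778
```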
M.G.F. of G(p):
M_X(t) = E(e^{tX}) = Σ_{x=0}^∞ e^{tx} qˣ p = p Σ_{x=0}^∞ (qe^t)ˣ = p / (1 - qe^t), provided qe^t < 1, i.e. t < -log q
Deductions:
dM_X(t)/dt = d/dt [p(1 - qe^t)⁻¹] = p(1 - qe^t)⁻² (qe^t) = pqe^t / (1 - qe^t)²
μ1' = [dM_X(t)/dt]_{t=0} = pq / (1 - q)² = q/p
d²M_X(t)/dt² = d/dt [pqe^t / (1 - qe^t)²] = pqe^t (1 + qe^t) / (1 - qe^t)³
μ2' = [d²M_X(t)/dt²]_{t=0} = pq(1 + q)/p³ = q(1 + q)/p²
Var(X) = μ2' - (μ1')² = q(1 + q)/p² - q²/p² = q/p²
Notes: 1. If Y has the p.m.f. P[Y = y] = p q^{y-1}, y = 1, 2, ..., q = 1 - p, then M_Y(t) = E(e^{tY}) = pe^t / (1 - qe^t).
2. Alternatively, since Y = X + 1, M_Y(t) = M_{X+1}(t) = e^t M_X(t) = pe^t / (1 - qe^t).
Cumulant generating function [C.G.F.] of G(p):
We know that the m.g.f. is given by
M_X(t) = p / (1 - qe^t), t < -log q
Hence the c.g.f. is
K_X(t) = log M_X(t) = log p - log(1 - qe^t)
Writing 1 - qe^t = p + q - qe^t = p[1 + (q/p)(1 - e^t)], we get
K_X(t) = -log[1 + (q/p)(1 - e^t)]
Expanding K_X(t) in powers of t, the cumulants κ1 = q/p, κ2 = q/p², ... can be read off.
Nature of geometric distribution :
[Fig. 7.4: Probability bar diagrams of geometric distribution for p = 3/4, p = 1/2 and p = 1/4.]
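The three panels of Fig. 7.4 can be regenerated from the p.m.f.; a sketch assuming numpy and matplotlib are available (axis ranges and styling are guesses, not the book's):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(0, 11)
fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))
for ax, p in zip(axes, [3 / 4, 1 / 2, 1 / 4]):
    q = 1 - p
    ax.bar(x, p * q ** x)              # geometric p.m.f. P(X = x) = q^x p
    ax.set_title(f"G(p = {p})")
    ax.set_xlabel("x")
    ax.set_ylabel("P(x)")
plt.show()
```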
7.8 Recurrence Relation between Probabilities of Geometric Distribution
We establish the relation between P(X = x) and P(X = x + 1).
Note that:
P(X = x) = qˣ p ; 0 < p < 1 ; x = 0, 1, 2, 3, ...
P(X = x + 1) = q^{x+1} p
∴ P(X = x + 1) / P(X = x) = q
∴ P(X = x + 1) = q P(X = x)
This is the recurrence relation between probabilities of geometric distribution. If we know P(X = 0) = p, then we can obtain P(X = 1), P(X = 2) and so on using the above relationship.
Note: The same recurrence relation holds good for the other form of the p.m.f. of geometric distribution.
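The recurrence gives a convenient way of generating successive probabilities starting from P(X = 0) = p; a minimal sketch with an illustrative p = 0.4:

```python
p = 0.4                      # illustrative value of the parameter
q = 1 - p

prob = p                     # P(X = 0) = p
for x in range(6):
    print(x, round(prob, 6))
    prob = q * prob          # recurrence: P(X = x + 1) = q * P(X = x)
```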
7.9 Distribution Function of Geometric Distribution
The distribution function of X is
F(x) = P[X ≤ x] = Σ_{k=0}^{r} qᵏ p = 1 - q^{r+1}, if r ≤ x < r + 1 ; r = 0, 1, 2, ...
     = 0, if x < 0.