DiscreteProbabilityDistributionFunctions part 2

The document discusses discrete and continuous probability distribution functions, focusing on the Poisson distribution, its properties, and examples of its application. It also covers uniform and exponential distribution functions, including their definitions, theorems, and moment generating functions. Key results include the mean and variance for each distribution type, along with practical examples illustrating their use in real-world scenarios.


CSC 124 Lec 9 Probability and Statistics

1 Discrete Probability Distribution Functions


1.1 Poisson Distribution Function
The distribution is due to Siméon D. Poisson (1781-1840).

Definition:
A random variable X is defined to have a Poisson distribution if its p.d.f. is given by

    f(x) = { e^{-λ} λ^x / x!,  x = 0, 1, 2, ...
           { 0,                elsewhere

where the parameter λ satisfies λ > 0.
Note:
We know that

    e^λ = Σ_{x=0}^{∞} λ^x / x!,

hence

    Σ_{x=0}^{∞} f(x) = Σ_{x=0}^{∞} e^{-λ} λ^x / x! = e^{-λ} Σ_{x=0}^{∞} λ^x / x! = e^{-λ} e^λ = 1.
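As a quick sanity check (an illustration, not part of the original notes), the normalization above can be verified numerically by summing a truncated series; the cutoff of 100 terms is an arbitrary choice at which the tail is negligible:

```python
import math

# Poisson p.m.f. as defined above; lam is the parameter λ > 0.
def poisson_pmf(x, lam):
    return math.exp(-lam) * lam ** x / math.factorial(x)

# The terms e^{-λ} λ^x / x! over x = 0, 1, 2, ... should sum to 1.
lam = 2.0
total = sum(poisson_pmf(x, lam) for x in range(100))  # tail beyond 100 is negligible
print(round(total, 10))  # -> 1.0
```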
Theorem:
Let X be a Poisson distributed random variable. Then E(X) = λ, Var(X) = λ and M(t) = e^{λ(e^t - 1)}.
Proof:
The moment generating function, mgf, denoted M(t), is by definition

    M(t) = E(e^{tX}) = Σ_{x=0}^{∞} e^{tx} f(x)                    (1)
         = Σ_{x=0}^{∞} e^{tx} e^{-λ} λ^x / x!
         = e^{-λ} Σ_{x=0}^{∞} (λe^t)^x / x!
         = e^{-λ} e^{λe^t} = e^{λ(e^t - 1)}.
Then the mean and variance follow from the moment generating function M(t).
Mean:

    M^{(1)}(t) = d/dt [ e^{-λ} e^{λe^t} ]                         (2)
               = e^{-λ} e^{λe^t} · λe^t
               = λ e^{-λ} e^t e^{λe^t}

    ⇒ E(X) = M^{(1)}(0) = λ e^{-λ} e^λ = λ.

For the variance, we get the 2nd moment from the mgf as

    M^{(2)}(t) = d/dt [ λ e^{-λ} e^t e^{λe^t} ]                   (3)
               = λ e^{-λ} ( e^t e^{λe^t} + e^t · λe^t e^{λe^t} )
               = λ e^{-λ} e^t e^{λe^t} (1 + λe^t)

    ⇒ E(X^2) = M^{(2)}(0) = λ e^{-λ} e^λ (1 + λ) = λ + λ^2.
Thus,

    Var(X) = M^{(2)}(0) - [M^{(1)}(0)]^2                          (4)
           = E(X^2) - (E(X))^2
           = λ + λ^2 - λ^2
           = λ.
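The derivative computations above can be checked numerically (an illustration, not part of the notes): differentiating M(t) = e^{λ(e^t - 1)} at t = 0 with finite differences recovers E(X) = λ and E(X^2) = λ + λ^2. The value λ = 4 is an arbitrary choice:

```python
import math

# mgf of the Poisson distribution, M(t) = exp(lam * (e^t - 1)).
lam = 4.0
def M(t):
    return math.exp(lam * (math.exp(t) - 1))

h = 1e-4  # step size for the finite differences
first = (M(h) - M(-h)) / (2 * h)             # central difference ~ M'(0) = E(X)
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # ~ M''(0) = E(X^2)

print(round(first, 4))   # -> 4.0  (= lam)
print(round(second, 4))  # -> 20.0 (= lam + lam^2)
```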

Alternatively, without making use of the mgf:

By definition,

    E(X) = Σ_{x=0}^{∞} x f(x)                                     (5)
         = Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
         = e^{-λ} λ Σ_{x=1}^{∞} λ^{x-1} / (x-1)!        (the sum is e^λ)
         = λ
and by writing X^2 = X(X - 1) + X, then

    E(X^2) = Σ_{x=0}^{∞} x^2 f(x)                                 (6)
           = Σ_{x=0}^{∞} [x(x-1) + x] e^{-λ} λ^x / x!
           = Σ_{x=2}^{∞} x(x-1) e^{-λ} λ^x / x! + Σ_{x=1}^{∞} x e^{-λ} λ^x / x!
           = e^{-λ} λ^2 Σ_{x=2}^{∞} λ^{x-2}/(x-2)! + e^{-λ} λ Σ_{x=1}^{∞} λ^{x-1}/(x-1)!    (each sum is e^λ)
           = λ^2 + λ
leading to

    Var(X) = E(X^2) - (E(X))^2                                    (7)
           = λ^2 + λ - λ^2
           = λ,

as obtained in (4) above.

1.1.1 Examples
1. The number of deaths by suicide in a certain institution is distributed as a Poisson process with parameter λ = 2 per day. Let X be the number of deaths by suicide per day. Find (i) P(X = 2), (ii) P(X = 0), (iii) P(X ≥ 3).
Solution:

    P(X = 2) = e^{-2} 2^2 / 2! = 2e^{-2} = 0.2707
    P(X = 0) = e^{-2} 2^0 / 0! = e^{-2} = 0.1353

    P(X ≥ 3) = P(X = 3) + P(X = 4) + ···
             = 1 - {P(X = 0) + P(X = 1) + P(X = 2)}
             = 1 - (0.1353 + 0.2707 + 0.2707)
             = 0.3233
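These probabilities can be reproduced with a short Python sketch (an illustration, not part of the notes):

```python
import math

# Poisson p.m.f. with lam = 2 deaths per day, as in Example 1.
def poisson_pmf(x, lam):
    return math.exp(-lam) * lam ** x / math.factorial(x)

lam = 2.0
p2 = poisson_pmf(2, lam)                                 # P(X = 2)
p0 = poisson_pmf(0, lam)                                 # P(X = 0)
p_ge_3 = 1 - sum(poisson_pmf(x, lam) for x in range(3))  # P(X >= 3)

print(round(p2, 4), round(p0, 4), round(p_ge_3, 4))  # -> 0.2707 0.1353 0.3233
```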

2. If the average number of deaths per day is λ = 2, then the average number of deaths per week is λ₁ = 7λ = 7 × 2 = 14 deaths per week. If we let X be the number of deaths per week, then

    f(x) = { e^{-14} 14^x / x!,  x = 0, 1, 2, ...
           { 0,                  elsewhere

Find P(X = 0), P(X = 1), P(X = 2) and P(X = 3).
Solution:

    P(X = 0) = e^{-14} 14^0 / 0! = e^{-14} = 0.000000831
    P(X = 1) = e^{-14} 14^1 / 1! = 14e^{-14} = 0.00001164
    P(X = 2) = e^{-14} 14^2 / 2! = 98e^{-14} = 0.0000815
    P(X = 3) = e^{-14} 14^3 / 3! = (14^3/6) e^{-14} = 0.00038
3. A controlled manufacturing process is 0.1% defective. What is the probability of taking 3 or more defectives from a lot of 100 pieces using (i) the Binomial distribution, (ii) the Poisson distribution?
Solution:
(i) p = probability of a defective = 0.001, q = 1 - p = 0.999 and n = 100.

    P(3 or more defectives) = P(3) + P(4) + ··· + P(100)
    ⇒ P(X ≥ 3) = 1 - {P(X = 0) + P(X = 1) + P(X = 2)}
               = 1 - {C(100,0) 0.001^0 0.999^100 + C(100,1) 0.001^1 0.999^99 + C(100,2) 0.001^2 0.999^98}
               = 1 - {0.904792 + 0.090570 + 0.004488}
               = 1 - 0.999850
               = 0.000150

(ii) Poisson: probability of finding 3 or more defectives.

Here, λ = np = 100 × 0.1/100 = 0.1.

    P(3 or more defectives) = P(3) + P(4) + ··· + P(100)
    ⇒ P(X ≥ 3) = 1 - {P(X = 0) + P(X = 1) + P(X = 2)}
               = 1 - { e^{-0.1} 0.1^0 / 0! + e^{-0.1} 0.1^1 / 1! + e^{-0.1} 0.1^2 / 2! }
               = 1 - e^{-0.1} {1 + 0.1 + 0.005}
               = 1 - 0.999845
               = 0.000155
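The two answers can be compared directly in Python (an illustrative sketch, not part of the notes); with n = 100 and p = 0.001 the Poisson approximation is very close to the exact binomial tail:

```python
import math

n, p = 100, 0.001
lam = n * p  # Poisson parameter, 0.1

def binom_pmf(x):
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x):
    return math.exp(-lam) * lam ** x / math.factorial(x)

# P(X >= 3) = 1 - P(0) - P(1) - P(2) under each model.
binom_tail = 1 - sum(binom_pmf(x) for x in range(3))
poisson_tail = 1 - sum(poisson_pmf(x) for x in range(3))

print(f"binomial: {binom_tail:.6f}, Poisson: {poisson_tail:.6f}")
# -> binomial: 0.000150, Poisson: 0.000155
```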

2 Continuous Probability Distribution Functions
• Uniform distribution functions

• Exponential distribution function


• Normal distribution functions

2.1 Uniform distribution function


Definition:
A random variable X is defined to have a Uniform distribution if the probability distribution function of X is given by

    f(x) = { 1/(b-a),  a < x < b
           { 0,        elsewhere

In this case, X is defined over the interval (a, b).


Theorem:
If X has a Uniform distribution, then E(X) = (a+b)/2, Var(X) = (b-a)^2/12 and the mgf is M_X(t) = (e^{tb} - e^{ta}) / (t(b-a)).
Proof:

    E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_a^b x/(b-a) dx
         = 1/(b-a) [x^2/2]_{x=a}^{b} = (a+b)/2

    E(X^2) = ∫_a^b x^2/(b-a) dx
           = 1/(b-a) [x^3/3]_{x=a}^{b} = (b^2 + ab + a^2)/3

    ⇒ Var(X) = E(X^2) - (E(X))^2
             = (b^2 + ab + a^2)/3 - ((a+b)/2)^2 = (b-a)^2/12

and

    M_X(t) = E(e^{tX}) = ∫_a^b e^{tx}/(b-a) dx
           = 1/(b-a) [e^{tx}/t]_{x=a}^{b} = (e^{tb} - e^{ta}) / (t(b-a))
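As a numerical illustration (not part of the notes), a midpoint Riemann sum over an arbitrarily chosen interval, here (a, b) = (2, 5), reproduces E(X) = (a+b)/2 and Var(X) = (b-a)^2/12:

```python
# Midpoint Riemann sum for the moments of a Uniform(a, b) density f(x) = 1/(b-a).
a, b = 2.0, 5.0
n = 100_000               # number of subintervals
dx = (b - a) / n
density = 1.0 / (b - a)

mean = sum((a + (i + 0.5) * dx) * density * dx for i in range(n))
second = sum((a + (i + 0.5) * dx) ** 2 * density * dx for i in range(n))
var = second - mean ** 2

print(round(mean, 6))  # -> 3.5   (= (a+b)/2)
print(round(var, 6))   # -> 0.75  (= (b-a)^2/12)
```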

2.2 Exponential distribution function
Definition:
A random variable X is defined to have an Exponential distribution if the probability distribution function of X is given by

    f(x) = { (1/β) e^{-x/β},  x > 0, β > 0
           { 0,               elsewhere

Theorem:
If X has an Exponential distribution, then E(X) = β, Var(X) = β^2 and the mgf is M_X(t) = 1/(1 - βt), t < 1/β.
Proof:

    E(e^{tX}) = ∫_0^∞ e^{tx} (1/β) e^{-x/β} dx
              = (1/β) ∫_0^∞ e^{(t - 1/β)x} dx
              = (1/β) [ e^{(t - 1/β)x} / (t - 1/β) ]_{x=0}^{∞}
              = -1/(βt - 1)
              = 1/(1 - βt),  t < 1/β
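The closed form can be checked numerically (an illustration, not part of the notes); here β = 2 and t = 0.25 < 1/β are arbitrary choices, and the integral is truncated at a point where the integrand is negligible:

```python
import math

beta, t = 2.0, 0.25          # requires t < 1/beta = 0.5
upper, n = 100.0, 1_000_000  # truncation point and number of midpoint slices
dx = upper / n

# Midpoint Riemann sum for E(e^{tX}) = integral of e^{tx} (1/beta) e^{-x/beta} dx.
mgf = sum(
    math.exp(t * x) * (1 / beta) * math.exp(-x / beta) * dx
    for x in ((i + 0.5) * dx for i in range(n))
)

print(round(mgf, 4))  # -> 2.0, matching 1/(1 - beta*t) = 1/(1 - 0.5)
```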

Note:
If we let β = 1/λ in the Exponential distribution above, then the function

    f(x) = { λ e^{-λx},  x > 0
           { 0,          elsewhere

also represents an Exponential distribution, where λ is a positive constant. Here the mgf is

    E(e^{tX}) = ∫_0^∞ e^{tx} λ e^{-λx} dx
              = λ ∫_0^∞ e^{-x(λ-t)} dx
              = λ [ -e^{-x(λ-t)}/(λ-t) ]_{x=0}^{∞}
              = λ/(λ - t),  t < λ,

with the expectation and the variance being 1/λ and 1/λ^2 respectively.
Exercise:
In both cases, show the expectation and variance, both from the definition and from the mgf.
