
Applied Statistics - Notes 2
Dr. T K Datta

Total Probability Rule:


Let E1, E2, …, En be mutually exclusive and exhaustive events and let B be another event. Then,
P(B) = P(E1 ∩ B) + P(E2 ∩ B) + … + P(En ∩ B)
     = P(E1)P(B/E1) + P(E2)P(B/E2) + … + P(En)P(B/En).
Example 1 (2-107): The probability is 2% that an electrical connector that is kept dry fails
during the warranty period of a portable computer. If the connector is wet, then the probability
of failure during the warranty period is 6%. If 90% of the connectors are kept dry and the
remaining 10% are wet, what is the probability that a randomly selected connector will fail
during the warranty period?
Solution: Define: E1 – connector is kept dry; E2 – the connector is wet; F – the connector fails.
To find P(F).
Given: P(E1) = 0.9, P(E2) = 0.1, P(F/E1) = 0.02 and P(F/E2) = 0.06.
By total probability rule,
P(F) = P(E1) P(F/E1) + P(E2) P(F/E2) = 0.9*0.02 + 0.1*0.06 = 0.024.
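
A minimal Python sketch of this total-probability calculation (an illustration only; the probabilities are those given in the example):

# Total probability rule: P(F) = P(E1)P(F/E1) + P(E2)P(F/E2)
p_E = [0.9, 0.1]             # P(E1) = P(dry), P(E2) = P(wet)
p_F_given_E = [0.02, 0.06]   # P(F/E1), P(F/E2)
p_F = sum(p * q for p, q in zip(p_E, p_F_given_E))
print(p_F)                   # 0.024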
Example 2: There are three identical boxes - Box 1, Box 2 and Box 3. Each box contains red,
white and black balls. The details are given below:
Ball      Box 1   Box 2   Box 3
Red         8      10       7
White       4       6       5
Black       8       6       6
A box is selected at random and a ball is drawn from it. Compute the probability that the ball
drawn is white.
Solution: E1: Box 1 is selected, E2: Box 2 is selected, E3: Box 3 is selected and W: a white ball
is drawn.
P(W) = P(E1) P(W/E1) + P(E2) P(W/E2) + P(E3) P(W/E3)
= 1/3* 4/20 + 1/3*6/22 + 1/3*5/18 = 0.2502.
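
The same rule in code, using the box counts from the table (a sketch, assuming each box is equally likely to be chosen):

# Box contents as (red, white, black) counts for Box 1, Box 2, Box 3.
boxes = [(8, 4, 8), (10, 6, 6), (7, 5, 6)]
# P(W) = sum over boxes of P(box) * P(white ball / box), with P(box) = 1/3.
p_white = sum((1/3) * w / (r + w + b) for r, w, b in boxes)
print(round(p_white, 4))     # 0.2502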

Bayes’ Theorem: Let E1, E2, …, En be mutually exclusive and exhaustive events and B be
another event. Then,
P(Ek/B) = P(Ek)P(B/Ek) / P(B)
        = P(Ek)P(B/Ek) / [P(E1)P(B/E1) + P(E2)P(B/E2) + … + P(En)P(B/En)], k = 1, 2, …, n.

Example 1: The probability is 2% that an electrical connector that is kept dry fails during the
warranty period of a portable computer. If the connector is wet, then the probability of failure


during the warranty period is 6%. Ninety percent of the connectors are kept dry and the
remaining 10% are wet. A randomly selected connector fails during the warranty period.
a) What is the probability that it was dry?
b) What is the probability that it was wet?
Solution: Define: E1 – connector is kept dry; E2 – the connector is wet; F – the connector fails.
To find P(E1/F) and P(E2/F).
Given: P(E1) = 0.9, P(E2) = 0.1, P(F/E1) = 0.02 and P(F/E2) = 0.06.
P(F) = P(E1) P(F/E1) + P(E2) P(F/E2) = 0.9*0.02 + 0.1*0.06 = 0.024.
a) By Bayes’ theorem, P(E1/F) = P(E1)P(F/E1) / P(F) = P(E1)P(F/E1) / [P(E1)P(F/E1) + P(E2)P(F/E2)]
   = 0.018/0.024 = 0.75.
b) By Bayes’ theorem, P(E2/F) = P(E2)P(F/E2) / P(F) = P(E2)P(F/E2) / [P(E1)P(F/E1) + P(E2)P(F/E2)]
   = 0.006/0.024 = 0.25.
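
A short Python sketch of the posterior calculation by Bayes’ theorem (values as given in the example):

# Bayes' theorem: P(Ei/F) = P(Ei) * P(F/Ei) / P(F)
p_E = [0.9, 0.1]             # P(dry), P(wet)
p_F_given_E = [0.02, 0.06]   # P(fail/dry), P(fail/wet)
p_F = sum(p * q for p, q in zip(p_E, p_F_given_E))             # 0.024
posteriors = [p * q / p_F for p, q in zip(p_E, p_F_given_E)]
print(posteriors)            # ≈ [0.75, 0.25]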

Example 2: There are three identical boxes - Box 1, Box 2 and Box 3. Each box contains red,
white and black balls. The details are given below:
Ball      Box 1   Box 2   Box 3
Red         8      10       7
White       4       6       5
Black       8       6       6
A box is selected at random and a ball is drawn from it. If the ball drawn is black, compute the
probability that the ball belongs to Box 3.
Solution: E1: Box 1 is selected, E2: Box 2 is selected, E3: Box 3 is selected and B: a black ball
is drawn.
P(B) = P(E1) P(B/E1) + P(E2) P(B/E2) + P(E3) P(B/E3)
= 1/3* 8/20 + 1/3*6/22 + 1/3*6/18 = 0.33535.
By Bayes’ theorem,
P(E3/B) = P(E3)P(B/E3) / [P(E1)P(B/E1) + P(E2)P(B/E2) + P(E3)P(B/E3)] = P(E3)P(B/E3) / P(B)
        = (1/3 × 6/18) / 0.33535 = 0.3313.
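
The same posterior in code, using the box counts above (a sketch only):

# P(Box 3 / black) = P(Box 3) * P(black / Box 3) / P(black)
boxes = [(8, 4, 8), (10, 6, 6), (7, 5, 6)]                    # (red, white, black) per box
p_black_given_box = [b / (r + w + b) for r, w, b in boxes]
p_black = sum((1/3) * p for p in p_black_given_box)           # ≈ 0.33535
p_box3 = (1/3) * p_black_given_box[2] / p_black
print(round(p_box3, 4))      # 0.3313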


Random variables, probability mass functions, probability density functions:
Definition: A random variable is a function that assigns a real value to each outcome in the
sample space of a random experiment. Random variables are denoted by uppercase letters such as
X, Y, … and their values by the corresponding lowercase letters x, y, …
Examples:
• Number of heads in 10 tosses of a coin.
• Number of even faces in rolling a die 5 times.
• Rainfall in a particular state during monsoon.
Types of random variables: Random variables are either discrete or continuous.
Discrete random variable: A random variable is called a discrete random variable if it can
take a value either from a finite set of values or from a countably infinite set of values.
Continuous random variable: A random variable is called a continuous random variable if
it can take any real value from a given interval, finite or infinite.
Examples: Describe whether the following random variables are discrete or continuous:
a) The time until a projectile returns to the earth. (C)
b) The number of times a transistor in a computer memory changes its state in one
operation. (D)
c) The outside diameter of a machined shaft. (C)
d) The concentration of output from a reactor. (C)
e) The number of cracks in a 1.5 km stretch of a highway. (D)

Discrete Probability Distribution

Some examples of discrete random variables:


1. A coin is tossed 8 times. X denotes the number of tails in 8 tosses. Here, X is a discrete
random variable and it can take on values 0, 1, 2, …, 8.
2. Suppose 5% of the printers of a supplier are defective. Let X denote the number of
defective printers out of 10 printers sold.
Probability mass function (pmf): The probability mass function of a discrete random variable
X is denoted by f(x) or p(x) and is defined by
f(x) = P(X = x), where x is any possible value of X.


Properties of f(x): Let the discrete random variable X take on values x1, x2, …, xn. Then
1. f(xk) ≥ 0, k = 1, 2, …, n.
2. f(x1) + f(x2) + … + f(xn) = 1.
3. f(xk) = P(X = xk), k = 1, 2, …, n.
Discrete Probability Distribution: All values of a discrete random variable X together with
their corresponding probabilities constitute the probability distribution of X. It can be
represented by a table of two rows, one showing the possible X values and the other the
corresponding probabilities, or sometimes by a function of x.
Example 1: Let X be a discrete random variable which can take on values 0, 1, 2, 3 with
corresponding probabilities 0.4, 0.1, 0.2 and 0.3, respectively. We can represent it in tabular
form as:
x      0    1    2    3
f(x)   0.4  0.1  0.2  0.3
This table defines the probability distribution of the discrete random variable X.
Example 2: f(x) = x²/35, x = 1, 3, 5.
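
A quick Python check (not part of the original notes) that f(x) = x²/35 satisfies the pmf properties listed above:

# f(x) = x**2 / 35 over x = 1, 3, 5: values are non-negative and sum to 1.
support = [1, 3, 5]
f = {x: x**2 / 35 for x in support}
assert all(p >= 0 for p in f.values())
assert abs(sum(f.values()) - 1) < 1e-12   # 1/35 + 9/35 + 25/35 = 1
print(f)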

Problem 1(3-15): Verify that the following function f(x) is a probability mass function. Also,
find the requested probabilities.
x -2 -1 0 1 2
f(x) 1/8 2/8 2/8 2/8 1/8

a) P(X ≤ 2); b) P(X > −2); c) P(−1 ≤ X ≤ 1); d) P(X ≤ −1 or X = 2); e) P(X < 1 and X > −1).
Problem 2: Find the value of m for which the following function f(x) defines a probability mass
function.
x 0 1.5 4 4.8 5
f(x) 0.10 0.25 2m 0.41 m

Solution: 0.10 + 0.25 + 2m + 0.41 + m = 1, i.e., 0.76 + 3m = 1. This gives m = 0.08.


Problem 3: Find the value of m for which the following function f(x) defines a probability mass
function.
f(x) = (2x + m)/25, x = 0, 1, 2, 3, 4.
Solution: f(0) = m/25, f(1) = (2+m)/25, f(2) = (4+m)/25, f(3) = (6+m)/25, f(4) = (8+m)/25.
f(0) + f(1) + f(2) + f(3) + f(4) = 1 implies (20 + 5m)/25 = 1, or m = 1.
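
A quick numerical check that m = 1 makes f a pmf (an illustration only):

# f(x) = (2x + m) / 25 for x = 0, 1, 2, 3, 4 should sum to 1 when m = 1.
m = 1
f = {x: (2 * x + m) / 25 for x in range(5)}
print(sum(f.values()))       # 1.0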

Problem 4 (3-22): An optical inspection system is to distinguish among different part types.
The probability of a correct classification of any part is 0.95. Suppose that three parts are


inspected and that the classifications are independent. Let the random variable X denote the
number of parts that are correctly classified. Determine the probability mass function of X.
Solution: Let us define the events:
C: correct part classification
I: incorrect part classification
S = {CCC, CCI, CIC, CII, ICC, ICI, IIC, III}.
The possible values of X are 0, 1, 2, 3.
P(C) = 0.95 and P(I) = 1-0.95 = 0.05.
If f(x) defines the probability mass function of X, then
f(0) = 0.05³ = 0.000125, f(1) = 3 × 0.95 × 0.05² = 0.007125,
f(2) = 3 × 0.95² × 0.05 = 0.135375, f(3) = 0.95³ = 0.857375.
x 0 1 2 3
f(x) 0.000125 0.007125 0.135375 0.857375
This can also be written as
f(x) = C(3, x) × 0.95^x × 0.05^(3−x), x = 0, 1, 2, 3.
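
A sketch of the same pmf computed in Python with math.comb (assuming independent classifications, as stated in the problem):

from math import comb

# f(x) = C(3, x) * 0.95**x * 0.05**(3 - x): number of correctly classified parts out of 3.
p = 0.95
f = {x: comb(3, x) * p**x * (1 - p)**(3 - x) for x in range(4)}
print(f)                     # ≈ {0: 0.000125, 1: 0.007125, 2: 0.135375, 3: 0.857375}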

Problem 5 (3-32): The distribution of the time until change (in days) of a Web site is given
approximately by the following table.

Days until change    Probability
1.5                  0.15
3.0                  0.15
4.5                  0.45
5.0                  0.20
7.0                  0.05

Let the random variable X denote the days until change. Determine the pmf of the days until
change.

Solution: Let X denote the random variable “days until change”.

The pmf is:


x 1.5 3.0 4.5 5.0 7.0
f(x) 0.15 0.15 0.45 0.20 0.05


Cumulative distribution function (CDF):

The cumulative distribution function (CDF) of a random variable X is defined as

F(x) = P(X ≤ x), for all x ∈ R.
• 𝑃(𝑋 ≤ 𝑘) = 𝐹(𝑘)
• 𝑃(𝑚 < 𝑋 ≤ 𝑛) = 𝐹(𝑛) − 𝐹(𝑚),
• 𝑃(𝑋 > 𝑚) = 1 − 𝐹(𝑚).
CDF from PMF:
Example 1: Find the CDF of the random variable X whose pmf is given below:
x 1.5 3.0 4.5 5.0 7.0
f(x) 0.15 0.15 0.45 0.20 0.05
Use F(x) to find P(X ≤ 4.5), P(X < 4.5), P(X ≥ 3.0), P(X > 4.5), P(3.0 < X ≤ 7), P(X > 7), P(X < 1).
Solution: The cumulative distribution function F(x) of X is given below:
x 1.5 3.0 4.5 5.0 7.0
F(x) 0.15 0.30 0.75 0.95 1

P(X ≤ 4.5) = F(4.5) = 0.75, P(X < 4.5) = F(3.0) = 0.30,
P(X ≥ 3.0) = 1 − F(1.5) = 0.85, P(X > 4.5) = 1 − F(4.5) = 0.25,
P(3.0 < X ≤ 7) = F(7) − F(3.0) = 0.70, P(X > 7) = 1 − F(7) = 0, P(X < 1) = 0.
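
A small Python sketch that builds F(x) from the pmf and reads off these probabilities:

xs = [1.5, 3.0, 4.5, 5.0, 7.0]
probs = [0.15, 0.15, 0.45, 0.20, 0.05]

def F(t):
    # CDF: total probability of all support points <= t.
    return sum(p for x, p in zip(xs, probs) if x <= t)

print(F(4.5), 1 - F(1.5), F(7) - F(3.0))   # ≈ 0.75, 0.85, 0.70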

Expectation, variance and standard deviation of a discrete random variable:


If the probability mass function of a discrete random variable X is f(x), then the mean or
expectation of the random variable is denoted by E(X) and is defined by
μ = E(X) = ∑ x f(x), where the sum runs over all values x of X.
Variance: σ² = Var(X) = ∑ (x − μ)² f(x) = ∑ x² f(x) − μ².
Standard deviation: σ = SD(X) = √σ² = √Var(X).

Formula: E[h(X)] = ∑ h(x) f(x), summed over all x.


Properties:
1. E(aX+b) = aE(X) + b.
2. Var(aX+b) = a²Var(X).
Example 1: The number of email messages received per hour has the following distribution:
x (number of messages): 10 11 12 13 14 15
f(x): 0.08 0.15 0.30 0.20 0.20 0.07
a) Find the cumulative distribution table.
b) Calculate E(X), var(X) and SD(X).


[Ans:
a) CDF table:
x:    10    11    12    13    14    15
F(x): 0.08  0.23  0.53  0.73  0.93  1
b) E(X) = 10*0.08 + 11*0.15 + 12*0.30 + 13*0.20 + 14*0.20 + 15*0.07
        = 0.8 + 1.65 + 3.6 + 2.6 + 2.8 + 1.05 = 12.5.
Var(X) = E(X²) – {E(X)}²
       = [10²*0.08 + 11²*0.15 + 12²*0.30 + 13²*0.20 + 14²*0.20 + 15²*0.07] – 12.5²
       = 1.85.
SD(X) = √1.85 = 1.36.]
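
A minimal Python sketch of the mean/variance calculation for this distribution:

from math import sqrt

xs = [10, 11, 12, 13, 14, 15]
probs = [0.08, 0.15, 0.30, 0.20, 0.20, 0.07]
mean = sum(x * p for x, p in zip(xs, probs))               # E(X) = 12.5
var = sum(x**2 * p for x, p in zip(xs, probs)) - mean**2   # Var(X) ≈ 1.85
print(mean, var, round(sqrt(var), 2))                      # 12.5, 1.85, 1.36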
Example 2: Errors in an experimental transmission channel are found when the transmissions
are checked by a certifier that detects missing pulses. The number of errors found in an eight-
bit byte is a random variable X with the following cumulative distribution function:
F(x) = 0     when x < 1
     = 0.7   when 1 ≤ x < 4
     = 0.9   when 4 ≤ x < 7
     = 1     when 7 ≤ x.
Determine each of the following probabilities:
(a) P(X ≤ 4) (b) P(X > 5) (c) P(X ≤5) (d) P(X > 7) (e) P(X ≤ 2).

[Solution: a) P(X ≤ 4) = F(4) = 0.9; b) P(X > 5) = 1- F(5) = 1 – 0.9 = 0.1;


(c) P(X ≤5) = F(5) = 0.9; (d) P(X > 7) = 1 – F(7) = 1 – 1 = 0;
(e) P(X ≤ 2) = F(2) = 0.7.]
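
The step CDF above can be written as a small Python function and used to read off the answers (a sketch only):

def F(x):
    # Step CDF of the number of errors in an eight-bit byte (from the example).
    if x < 1:
        return 0.0
    elif x < 4:
        return 0.7
    elif x < 7:
        return 0.9
    return 1.0

print(F(4), 1 - F(5), F(5), 1 - F(7), F(2))   # ≈ 0.9, 0.1, 0.9, 0.0, 0.7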
