STAT 102 Part 13

Some important distributions

A. Discrete distributions
1. Bernoulli distribution
2. Binomial distribution
3. Poisson distribution
4. Hypergeometric distribution
5. Geometric distribution
6. Negative binomial distribution
B. Continuous distributions
1. Uniform distribution
2. Exponential distribution
3. Normal distribution
4. Gamma distribution
5. Beta distribution
Bernoulli distribution
Bernoulli trial
A trial that has two possible outcomes is known as a Bernoulli trial. The two
outcomes are called ‘success’ (probability p) and ‘failure’ (probability 1 − p).
Examples
1. A fair coin is tossed. Here, head (or tail) is a success. P(H) = p = 0.5.
2. An unfair coin is tossed. P(H) = p = 0.7 (say).
3. A fair die is rolled. P(6) = p = 1/6.
4. A patient is tested for COVID-19. P(+) = p = 0.05 (say).
Bernoulli random variable
Let a Bernoulli trial be performed once. Then S = {s, f}. Let X be the number of
successes obtained. Then X is a Bernoulli random variable with possible values 0
and 1. It has the following pmf:

    x           0        1
    P(X = x)    1 − p    p

We can write the above pmf as follows:
    P(X = x) = p^x (1 − p)^(1−x),   x = 0, 1.
Note
Here, p is the ‘parameter’ of the probability distribution. It characterizes the
probability distribution (population). We write
    X ~ Bernoulli(p).
That is, X follows a Bernoulli distribution with parameter p.
Mean and variance
E(X) = 0 × (1 − p) + 1 × p = p
E(X^2) = 0^2 × (1 − p) + 1^2 × p = p
V(X) = E(X^2) − [E(X)]^2 = p − p^2 = p(1 − p)
Note that the mean is greater than the variance.
Also, V(X) ≤ 0.25. (Why?)
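A quick numerical check of these formulas (an illustrative sketch, not part of the notes) computes the mean and variance directly from the pmf and scans p over a grid to confirm that p(1 − p) never exceeds 0.25, with the maximum at p = 0.5:

```python
# Sketch: Bernoulli mean/variance computed from the pmf P(X = 0) = 1 - p, P(X = 1) = p.

def bernoulli_mean_var(p):
    mean = 0 * (1 - p) + 1 * p                 # E(X) = p
    second_moment = 0**2 * (1 - p) + 1**2 * p  # E(X^2) = p
    var = second_moment - mean**2              # V(X) = p - p^2 = p(1 - p)
    return mean, var

# The variance p(1 - p) is largest at p = 0.5, where it equals 0.25.
variances = [bernoulli_mean_var(p / 1000)[1] for p in range(1001)]
print(max(variances))   # 0.25
```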
Example
In a multiple-choice question (MCQ), there are four choices. A student does not
know the correct answer and selects one of the choices at random. Let X be the
number of correct answers (out of 1 question). Then
    X ~ Bernoulli(p = 0.25).

Binomial distribution
Binomial experiment
A binomial experiment consists of n independent Bernoulli trials, where n is a
positive integer. The probability of success (p) is the same in each trial.
Examples
1. A fair coin is tossed 5 times. Here, n = 5 and p = 0.5.
2. A coin (probability of head 0.7) is tossed 10 times. Here, n = 10 and p = 0.7.
Binomial random variable
Let n independent Bernoulli trials be performed, each having probability of success
p. Let X be the number of successes obtained. Then X is a binomial random variable
with possible values 0, 1, 2, …, n. It has the following pmf:
    P(X = x) = C(n, x) p^x (1 − p)^(n−x),   x = 0, 1, 2, …, n,
where C(n, x) = n! / (x! (n − x)!) is the binomial coefficient.
Note
The distribution has two parameters, n and p. We write
    X ~ binomial(n, p).
That is, X follows a binomial distribution with parameters n and p.

Explanation of the pmf


Let 5 independent Bernoulli trials be performed, each having probability of success
p. The sample space is as follows:
    S = {sssss, ssssf, sssfs, …, sssff, …, sffss, …, fffff}   (2^5 = 32 outcomes)
Since the trials are independent, we have
    P(sssss) = p × p × p × p × p = p^5
    P(sssff) = p × p × p × (1 − p) × (1 − p) = p^3 (1 − p)^2
    P(sffss) = p × (1 − p) × (1 − p) × p × p = p^3 (1 − p)^2
    P(fffff) = (1 − p) × ⋯ × (1 − p) = (1 − p)^5
Probability that we will get 3 successes:
    P(X = 3) = P(sssff or sffss or ⋯ or ffsss)
             = p^3 (1 − p)^2 + p^3 (1 − p)^2 + ⋯ + p^3 (1 − p)^2
               [one term for each of the C(5, 3) = 10 orderings of 3 s’s and 2 f’s]
             = C(5, 3) p^3 (1 − p)^2
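This counting argument can be verified by brute force. The sketch below (not from the notes; the value p = 0.3 is arbitrary) enumerates all 2^5 outcomes, adds the probabilities of those with exactly 3 successes, and compares the total with C(5, 3) p^3 (1 − p)^2:

```python
import math
from itertools import product

p = 0.3        # success probability, chosen only for illustration
n, k = 5, 3    # 5 trials; count outcomes with exactly 3 successes

# Sum the probabilities of all length-5 outcomes containing exactly 3 successes.
total = 0.0
for outcome in product("sf", repeat=n):
    if outcome.count("s") == k:
        total += p**k * (1 - p)**(n - k)

formula = math.comb(n, k) * p**k * (1 - p)**(n - k)
print(total, formula)   # both equal 10 * 0.3^3 * 0.7^2 = 0.1323
```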
Note
(p + (1 − p))^n = 1^n = 1
If we expand the left-hand side of the above equation (binomial expansion), we get
the sum of all the probabilities of the binomial distribution, which therefore equals one.
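A short numerical check of this fact (illustrative only; n = 10 and p = 0.25 are arbitrary choices):

```python
import math

n, p = 10, 0.25
total = sum(math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))
print(total)   # 1.0 up to floating-point rounding, matching (p + (1 - p))^n = 1
```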
Note
Let X ~ binomial(n, p). Then
    X = Y_1 + Y_2 + ⋯ + Y_n,
where Y_i = number of successes in the i-th trial, i = 1, 2, …, n. That is,
Y_i ~ Bernoulli(p). Thus, a binomial(n, p) random variable is the sum of n
independent Bernoulli(p) random variables.
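This construction is easy to simulate. The sketch below (not part of the notes; n = 10, p = 0.25 and the sample size are arbitrary) draws X as a sum of n independent Bernoulli(p) variables and compares the observed frequency of X = 3 with the binomial pmf:

```python
import math
import random

random.seed(0)
n, p = 10, 0.25        # assumed parameters, for illustration only
num_samples = 200_000

def binomial_draw(n, p):
    # X = Y_1 + ... + Y_n, each Y_i an independent Bernoulli(p) indicator.
    return sum(1 if random.random() < p else 0 for _ in range(n))

samples = [binomial_draw(n, p) for _ in range(num_samples)]

empirical = samples.count(3) / num_samples
exact = math.comb(n, 3) * p**3 * (1 - p)**(n - 3)
print(round(empirical, 4), round(exact, 4))   # the two values should be close
```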
Mean and variance
E(X) = np
V(X) = np(1 − p)
These follow by adding the means and variances of the n independent Bernoulli(p)
terms. The variance is smaller than the mean, since 1 − p < 1.
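A direct check of both formulas against the pmf (illustrative only; n = 10, p = 0.25 assumed):

```python
import math

n, p = 10, 0.25
pmf = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * pmf[x] for x in range(n + 1))
var = sum((x - mean)**2 * pmf[x] for x in range(n + 1))
print(mean, n * p)            # both approximately 2.5
print(var, n * p * (1 - p))   # both approximately 1.875
```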
Exercise
In an MCQ test, there are 10 questions each with 4 choices. Smith chooses all the
answers at random. What is the probability that the number of correct answers will
be (i) exactly 2 (ii) less than 2 (iii) 2 or less?
Solution
Let X = number of correct answers. Then
    X ~ binomial(n = 10, p = 0.25).
(i) P(X = 2) = P(2) = C(10, 2) (0.25)^2 (0.75)^8 = 0.2816
(ii) P(X < 2) = P(0) + P(1) = 0.2440
(iii) P(X ≤ 2) = P(0) + P(1) + P(2) = 0.5256
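The three answers can be reproduced with a few lines of Python (an illustrative check, not part of the notes):

```python
import math

n, p = 10, 0.25   # 10 questions, probability 0.25 of guessing one correctly

def pmf(x):
    # P(X = x) for X ~ binomial(n, p)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

print(f"{pmf(2):.4f}")                    # (i)   0.2816
print(f"{pmf(0) + pmf(1):.4f}")           # (ii)  0.2440
print(f"{pmf(0) + pmf(1) + pmf(2):.4f}")  # (iii) 0.5256
```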
