Important PMFs and PDFs
Here's a summary of the most important probability mass functions (PMFs) and probability density
functions (PDFs) commonly used in graduate-level statistics:
1. Bernoulli
PMF: P(X = x) = p^x (1 − p)^(1−x), x ∈ {0, 1}
Support: x ∈ {0, 1}, 0 ≤ p ≤ 1
Single trial with success probability p.
2. Binomial
PMF: P(X = x) = C(n, x) p^x (1 − p)^(n−x), x ∈ {0, 1, …, n}
Support: x = 0, 1, …, n; 0 ≤ p ≤ 1
n independent Bernoulli trials.
3. Geometric
PMF: P(X = x) = (1 − p)^(x−1) p, x ≥ 1
Support: x = 1, 2, 3, …, 0 ≤ p ≤ 1
Number of trials until the first success.
4. Negative Binomial
PMF: P(X = x) = C(x−1, r−1) p^r (1 − p)^(x−r), x ≥ r
Support: x = r, r + 1, …, 0 ≤ p ≤ 1
Number of trials until the r-th success.
5. Poisson
PMF: P(X = x) = λ^x e^(−λ) / x!, x ≥ 0
Support: x = 0, 1, 2, …, λ > 0
Number of events in a fixed interval.
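The five discrete PMFs above can be evaluated directly; a quick sketch using scipy.stats (assuming SciPy is installed). One gotcha worth noting: SciPy's negative binomial counts failures before the r-th success, not total trials.

```python
import math
from scipy import stats

# Bernoulli(p): P(X = 1) = p
p_bern = stats.bernoulli.pmf(1, 0.7)

# Binomial(n, p): P(X = 3) = C(10, 3) p^3 (1 - p)^7
p_binom = stats.binom.pmf(3, 10, 0.5)

# Geometric(p): P(X = 4) = (1 - p)^3 p  (SciPy's support starts at 1, as above)
p_geom = stats.geom.pmf(4, 0.25)

# Poisson(lam): P(X = 5) = lam^5 e^(-lam) / 5!
p_pois = stats.poisson.pmf(5, 3)

# Negative Binomial: SciPy's nbinom counts FAILURES before the r-th success,
# so "x trials until the r-th success" maps to k = x - r failures.
r, p, x = 5, 0.2, 10
p_nb = stats.nbinom.pmf(x - r, r, p)
```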
1. Uniform (Continuous)
PDF: f(x) = 1/(b − a), a ≤ x ≤ b
Support: a ≤ x ≤ b
Equal probability over [a, b].
2. Normal (Gaussian)
PDF: f(x) = (1/√(2πσ²)) e^(−(x−μ)²/(2σ²))
Support: x ∈ ℝ; σ > 0
Symmetric, bell-shaped curve centered at μ.
Printed using ChatGPT to PDF, powered by PDFCrowd HTML to PDF API.
3. Exponential
PDF: f(x) = λ e^(−λx), x ≥ 0
Support: x ≥ 0, λ > 0
Time until the next event in a Poisson process.
4. Gamma
PDF: f(x) = λ^k x^(k−1) e^(−λx) / Γ(k), x > 0
Support: x > 0; k > 0, λ > 0
Sum of k independent Exponential(λ) random variables (waiting time until the k-th event in a Poisson process).
7. Student's t
PDF: f(x) = [Γ((ν+1)/2) / (√(νπ) Γ(ν/2))] (1 + x²/ν)^(−(ν+1)/2)
Support: x ∈ ℝ; ν > 0
Arises when estimating a normal mean with unknown variance; heavier tails than the normal.
8. F
PDF: f(x) = [Γ((d1+d2)/2) / (Γ(d1/2) Γ(d2/2))] (d1/d2)^(d1/2) x^(d1/2 − 1) (1 + d1 x/d2)^(−(d1+d2)/2)
Support: x ≥ 0; d1, d2 > 0
Ratio of two independent chi-square variables, each divided by its degrees of freedom.
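The Gamma, Student's t, and F densities above are easy to mistype; here is a sketch that hand-codes each formula and compares it against scipy.stats at a single point (assuming SciPy is available):

```python
import math
from scipy import stats

def gamma_pdf(x, k, lam):
    # f(x) = lam^k x^(k-1) e^(-lam x) / Gamma(k)
    return lam**k * x**(k - 1) * math.exp(-lam * x) / math.gamma(k)

def t_pdf(x, nu):
    # f(x) = Gamma((nu+1)/2) / (sqrt(nu pi) Gamma(nu/2)) * (1 + x^2/nu)^(-(nu+1)/2)
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x**2 / nu) ** (-(nu + 1) / 2)

def f_pdf(x, d1, d2):
    # f(x) = Gamma((d1+d2)/2) / (Gamma(d1/2) Gamma(d2/2))
    #        * (d1/d2)^(d1/2) x^(d1/2 - 1) (1 + d1 x / d2)^(-(d1+d2)/2)
    c = math.gamma((d1 + d2) / 2) / (math.gamma(d1 / 2) * math.gamma(d2 / 2))
    return (c * (d1 / d2) ** (d1 / 2) * x ** (d1 / 2 - 1)
            * (1 + d1 * x / d2) ** (-(d1 + d2) / 2))

x = 1.3
# Note: SciPy parameterizes Gamma by scale = 1/lambda
ref_gamma = stats.gamma.pdf(x, 3, scale=1 / 2)
ref_t = stats.t.pdf(x, 5)
ref_f = stats.f.pdf(x, 3, 7)
```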
These distributions form the foundation for a wide variety of statistical methods, including hypothesis
testing, Bayesian modeling, and regression analysis. Would you like help visualizing any of these or
deriving properties like mean, variance, or moments?
I need help in linking the functions conceptually so that they are easy to apply and remember
To link and remember these probability functions conceptually, focus on grouping them by their
relationships and contexts of use, then build a story around them. Here’s how you can do it:
Uniform (Continuous): Equal likelihood over an interval. Think of a spinner that can stop anywhere on a dial: every value in [a, b] is equally likely.
Exponential: Time until the first event occurs (e.g., time between bus arrivals).
Link to Poisson: Exponential gives the waiting time between events in a Poisson process.
Key Connection: Poisson counts events, Exponential gives the waiting time between them.
Gamma: Sum of k Exponential random variables.
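The Exponential → Gamma link can be verified by simulation; a minimal sketch using only the standard library (seed and sample size are arbitrary choices):

```python
import random

random.seed(0)
lam, k, n = 2.0, 3, 100_000

# Sum of k Exponential(lam) waiting times = waiting time to the k-th event,
# which is exactly a Gamma(k, lam) draw.
sums = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]

mean = sum(sums) / n                             # Gamma mean: k/lam = 1.5
var = sum((s - mean) ** 2 for s in sums) / n     # Gamma variance: k/lam^2 = 0.75
```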
Normal (Gaussian): Bell-shaped, symmetric, and arises naturally due to the Central Limit
Theorem.
Key Story: Heights, weights, and measurement errors follow the normal distribution because
they result from summing many small effects.
Log-Normal: If something grows multiplicatively (e.g., stock prices), the log of the quantity is
normally distributed.
Conceptual Link: Normal → Linear additive effects; Log-Normal → Multiplicative effects.
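A small simulation of the Normal ↔ Log-Normal link: exponentiate normal draws to get multiplicative (log-normal) data, then confirm that taking logs recovers the normal parameters (seed and sample size are arbitrary):

```python
import math
import random

random.seed(1)
mu, sigma, n = 2.0, 0.7, 100_000

# Multiplicative growth: exponentiating a normal draw gives a log-normal draw.
xs = [math.exp(random.gauss(mu, sigma)) for _ in range(n)]

# Taking logs should recover the underlying normal.
logs = [math.log(x) for x in xs]
log_mean = sum(logs) / n                              # ~ mu = 2.0
log_var = sum((v - log_mean) ** 2 for v in logs) / n  # ~ sigma^2 = 0.49
```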
Cauchy: Extremely heavy tails, meaning it’s dominated by extreme values (no finite
mean/variance).
Story: A noisy signal where large deviations are common.
Key Connection: Cauchy arises as the ratio of two independent standard normal variables.
Key Insight: t and F are both built from the Chi-Square distribution.
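The same idea works for t and F: build them from standard normals and chi-squares by their defining constructions and check a known moment. A simulation sketch, not a derivation:

```python
import random

random.seed(2)

def chi2(df):
    # Chi-square(df) draw: sum of df squared standard normals
    return sum(random.gauss(0, 1) ** 2 for _ in range(df))

n = 50_000

# t_nu = Z / sqrt(V/nu), with Z ~ N(0,1) and V ~ chi2(nu) independent
nu = 8
ts = [random.gauss(0, 1) / (chi2(nu) / nu) ** 0.5 for _ in range(n)]
t_var = sum(t * t for t in ts) / n      # theory: nu/(nu-2) = 4/3

# F(d1, d2) = (U/d1) / (V/d2), with U ~ chi2(d1), V ~ chi2(d2) independent
d1, d2 = 4, 10
fs = [(chi2(d1) / d1) / (chi2(d2) / d2) for _ in range(n)]
f_mean = sum(fs) / n                    # theory: d2/(d2-2) = 1.25
```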
7. Constraining Probabilities
Beta: Lives on [0, 1], so it is the natural model for probabilities and proportions themselves.
Key Connection: Beta is the conjugate prior for the Bernoulli/Binomial success probability p.
Would you like help building visual aids, flowcharts, or practice problems to solidify these links?
Practice problems
Here are practice problems organized by conceptual groups to help reinforce the connections between
the distributions:
1. Bernoulli:
A coin has a 70% chance of landing heads. Let X be the outcome of one flip (X = 1 for
heads, X = 0 for tails).
(a) What is P (X = 1)?
(b) What is E(X) and Var(X)?
2. Binomial:
A die is rolled 10 times. Let X be the number of times a 6 appears.
(a) What is the PMF of X ?
(b) What is P (X = 3)?
(c) Compute E(X) and Var(X).
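A quick check of (b) and (c) with scipy.stats, assuming p = 1/6 for rolling a 6 (a sketch, not a full solution):

```python
import math
from scipy import stats

n, p = 10, 1 / 6
# (b) P(X = 3) = C(10, 3) (1/6)^3 (5/6)^7
p3 = stats.binom.pmf(3, n, p)
# (c) E(X) = n p, Var(X) = n p (1 - p)
mean, var = n * p, n * p * (1 - p)
```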
3. Geometric:
A basketball player makes a free throw with a probability of 0.75. Let X represent the number
of attempts until her first success.
(a) What is the PMF of X ?
(b) What is P (X = 4)?
(c) Compute E(X) and Var(X).
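The geometric answers follow directly from the formulas above; a short scipy.stats check (assuming SciPy is available):

```python
from scipy import stats

p = 0.75
# (b) P(X = 4) = (1 - p)^3 p
p4 = stats.geom.pmf(4, p)
# (c) E(X) = 1/p, Var(X) = (1 - p)/p^2
mean = 1 / p
var = (1 - p) / p**2
```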
4. Negative Binomial:
A gambler plays a game with a 20% chance of winning. Let X be the number of games
played until the gambler wins 5 times.
(a) Write the PMF of X .
(b) What is P (X = 10)?
(c) Compute E(X) and Var(X).
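A pure-standard-library check of the negative binomial answers (a sketch under the trials-until-r-th-success convention used above):

```python
import math

r, p = 5, 0.2
# (b) P(X = 10) = C(9, 4) p^5 (1 - p)^5
p10 = math.comb(9, 4) * p**5 * (1 - p) ** 5
# (c) E(X) = r/p, Var(X) = r(1 - p)/p^2
mean = r / p       # 25 games on average
var = r * (1 - p) / p**2
```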
5. Poisson:
A call center receives an average of 3 calls per minute. Let X be the number of calls in a
randomly selected minute.
(a) What is the PMF of X ?
(b) What is P (X = 5)?
(c) Compute E(X) and Var(X).
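The Poisson answers can be computed by hand from the PMF; a minimal check:

```python
import math

lam = 3
# (b) P(X = 5) = lam^5 e^(-lam) / 5!
p5 = lam**5 * math.exp(-lam) / math.factorial(5)
# (c) For Poisson, mean and variance are both lambda
mean = var = lam
```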
6. Exponential:
In the same call center, the time between calls follows an exponential distribution. Let T
represent the time (in minutes) between two calls, with rate λ = 3.
(a) Write the PDF of T .
(b) What is P(T > 1)?
(c) Compute E(T ) and Var(T ).
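A check of the exponential answers; note the SciPy gotcha that the exponential is parameterized by scale = 1/λ, not by the rate:

```python
import math
from scipy import stats

lam = 3
# (b) P(T > 1) = e^(-lam), via the survival function (scale = 1/lambda)
p_gt1 = stats.expon.sf(1, scale=1 / lam)
# (c) E(T) = 1/lam, Var(T) = 1/lam^2
mean, var = 1 / lam, 1 / lam**2
```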
7. Normal:
Heights of a population are normally distributed with μ = 170 cm and σ = 10 cm.
(a) What is the PDF of this distribution?
(b) What is P (160 ≤ X ≤ 180)?
(c) If we standardize X, what is the distribution of Z = (X − μ)/σ?
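A scipy.stats check for (b), using the CDF difference (a sketch, assuming SciPy is available):

```python
from scipy import stats

mu, sigma = 170, 10
# (b) P(160 <= X <= 180) = Phi(1) - Phi(-1), about 0.683
prob = stats.norm.cdf(180, mu, sigma) - stats.norm.cdf(160, mu, sigma)
# (c) Standardizing gives Z = (X - mu)/sigma ~ N(0, 1)
```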
8. Log-Normal:
The lifetime of a device follows a log-normal distribution with parameters μ = 2 and σ² = 0.5.
(a) Write the PDF of the log-normal distribution.
(b) Compute E(X) in terms of μ and σ .
(c) What is the 95th percentile of X ?
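A scipy.stats check for (b) and (c); the gotcha here is that SciPy's lognorm takes shape s = σ and scale = exp(μ):

```python
import math
from scipy import stats

mu, sigma2 = 2, 0.5
sigma = math.sqrt(sigma2)

# SciPy parameterization: shape s = sigma, scale = exp(mu)
dist = stats.lognorm(s=sigma, scale=math.exp(mu))

# (b) E(X) = exp(mu + sigma^2 / 2)
mean = math.exp(mu + sigma2 / 2)
# (c) 95th percentile = exp(mu + z_0.95 * sigma)
q95 = dist.ppf(0.95)
```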
9. Cauchy:
Let X follow a standard Cauchy distribution (x0 = 0, γ = 1).
(a) Write the PDF of X .
(b) What is P (−1 ≤ X ≤ 1)?
(c) Discuss why the mean and variance of X do not exist.
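Part (b) can be done exactly from the Cauchy CDF, with no numerics beyond arctan; a short check:

```python
import math

def cauchy_cdf(x):
    # Standard Cauchy CDF: F(x) = 1/2 + arctan(x)/pi
    return 0.5 + math.atan(x) / math.pi

# (b) P(-1 <= X <= 1) = F(1) - F(-1) = 1/2 exactly
prob = cauchy_cdf(1) - cauchy_cdf(-1)

# (c) The tails decay like 1/x^2, so x * f(x) ~ 1/x is not integrable:
# the integral defining E(X) diverges, hence no mean and no variance.
```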
10. Chi-Square:
Let X ∼ χ²₅ (5 degrees of freedom).
11. Student's t:
A dataset of 10 observations is sampled from a normal distribution, and the sample variance is used to compute a t-statistic. Let T ∼ t₉.
12. F -Distribution:
Two independent samples, with population variances σ₁² and σ₂², are drawn. The ratio of their sample variances, each scaled by its population variance, follows an F-distribution.
13. Beta:
Let X ∼ Beta(2, 5).
(a) Write the PDF of X .
(b) Compute E(X) and Var(X).
(c) What is P (X ≤ 0.3)?
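A scipy.stats check of the Beta answers (a sketch; the CDF value in (c) has no simple closed form beyond the incomplete beta function):

```python
from scipy import stats

a, b = 2, 5
# (b) E(X) = a/(a+b), Var(X) = a b / ((a+b)^2 (a+b+1))
mean = a / (a + b)                           # 2/7
var = a * b / ((a + b) ** 2 * (a + b + 1))   # 10/392
# (c) P(X <= 0.3) via the regularized incomplete beta function
p = stats.beta.cdf(0.3, a, b)
```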
14. Gamma:
A system has failures occurring according to a Poisson process, and the time until the 3rd
failure follows a Gamma distribution. Let X ∼ Gamma(k = 3, λ = 2).
(a) Write the PDF of X .
(b) Compute E(X) and Var(X).
(c) What is P (X > 2)?
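A check of the Gamma answers; for integer k the tail is an Erlang sum, and SciPy again uses scale = 1/λ:

```python
import math
from scipy import stats

k, lam = 3, 2
# (b) E(X) = k/lam, Var(X) = k/lam^2
mean, var = k / lam, k / lam**2
# (c) P(X > 2): for integer k this is the Erlang tail
#     e^(-lam x) * sum_{i<k} (lam x)^i / i!  =  13 e^(-4) here
p_gt2 = stats.gamma.sf(2, k, scale=1 / lam)
```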
Let me know if you need solutions, derivations, or hints for any of these!