Important PMFs and PDFs

The document provides an overview of essential probability mass functions (PMFs) and probability density functions (PDFs) used in graduate-level statistics, including Bernoulli, Binomial, Poisson, Normal, and others. It categorizes these functions into discrete and continuous distributions, explaining their relationships and contexts of use, such as waiting times and event counts. Additionally, the document includes practice problems organized by conceptual groups to reinforce understanding of these distributions.

Help me remember all the important probability mass/density functions which are used to compute the majority of probability distributions (required at graduate-level statistics)

Here's a summary of the most important probability mass functions (PMFs) and probability density
functions (PDFs) commonly used in graduate-level statistics:

Discrete Distributions (PMFs)

1. Bernoulli
PMF: P(X = x) = p^x (1 - p)^{1-x}, x ∈ {0, 1}
Support: x ∈ {0, 1}, 0 ≤ p ≤ 1
Single trial with success probability p.
2. Binomial
PMF: P(X = x) = \binom{n}{x} p^x (1 - p)^{n-x}, x ∈ {0, 1, …, n}
Support: x = 0, 1, …, n; 0 ≤ p ≤ 1
n independent Bernoulli trials.
3. Geometric
PMF: P(X = x) = (1 - p)^{x-1} p, x ≥ 1
Support: x = 1, 2, 3, …; 0 ≤ p ≤ 1
Number of trials until the first success.
4. Negative Binomial
PMF: P(X = x) = \binom{x-1}{r-1} p^r (1 - p)^{x-r}, x ≥ r
Support: x = r, r + 1, …; 0 ≤ p ≤ 1
Number of trials until the r-th success.
5. Poisson
PMF: P(X = x) = \frac{λ^x e^{-λ}}{x!}, x ≥ 0
Support: x = 0, 1, 2, …; λ > 0
Number of events in a fixed interval.
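To make these formulas concrete, here is a minimal sketch (assuming Python with scipy, which the original text does not use; the parameter values are arbitrary illustrations) that evaluates each discrete PMF at one point and compares it with the closed form above.

```python
# Sketch: evaluate each discrete PMF with scipy.stats and compare to the closed forms above.
from math import comb, exp, factorial

from scipy import stats

p, n, r, lam = 0.3, 10, 3, 2.0   # illustrative parameter choices

# Bernoulli: P(X = 1) = p
print(stats.bernoulli.pmf(1, p), p)

# Binomial: P(X = 4) = C(10, 4) p^4 (1 - p)^6
print(stats.binom.pmf(4, n, p), comb(n, 4) * p**4 * (1 - p)**6)

# Geometric (trials until first success): P(X = 5) = (1 - p)^4 p
print(stats.geom.pmf(5, p), (1 - p)**4 * p)

# Negative Binomial: scipy counts failures before the r-th success,
# so P(total trials = x) corresponds to stats.nbinom.pmf(x - r, r, p).
x = 7
print(stats.nbinom.pmf(x - r, r, p), comb(x - 1, r - 1) * p**r * (1 - p)**(x - r))

# Poisson: P(X = 4) = λ^4 e^{-λ} / 4!
print(stats.poisson.pmf(4, lam), lam**4 * exp(-lam) / factorial(4))
```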

Continuous Distributions (PDFs)

1. Uniform (Continuous)
PDF: f(x) = \frac{1}{b - a}, a ≤ x ≤ b
Support: a ≤ x ≤ b
Equal probability over [a, b].
2. Normal (Gaussian)
PDF: f(x) = \frac{1}{\sqrt{2πσ^2}} e^{-\frac{(x - μ)^2}{2σ^2}}
Support: x ∈ (−∞, ∞), μ ∈ R, σ > 0
Symmetric, bell-shaped curve centered at μ.
3. Exponential
PDF: f(x) = λ e^{-λx}, x ≥ 0
Support: x ≥ 0, λ > 0
Time until the next event in a Poisson process.
4. Gamma
PDF: f(x) = \frac{λ^k x^{k-1} e^{-λx}}{Γ(k)}, x > 0
Support: x > 0; k > 0, λ > 0
Generalization of the exponential; used in Bayesian inference.
5. Beta
PDF: f(x) = \frac{x^{α-1} (1 - x)^{β-1}}{B(α, β)}, 0 < x < 1
Support: x ∈ (0, 1), α, β > 0
Probability distribution of probabilities.
6. Chi-Square
PDF: f(x) = \frac{1}{2^{k/2} Γ(k/2)} x^{k/2 - 1} e^{-x/2}, x ≥ 0
Support: x ≥ 0, k > 0
Sum of squares of k independent standard normal random variables.
7. Student's t
PDF: f(x) = \frac{Γ(\frac{ν+1}{2})}{\sqrt{νπ}\, Γ(\frac{ν}{2})} \left(1 + \frac{x^2}{ν}\right)^{-\frac{ν+1}{2}}
Support: x ∈ (−∞, ∞), ν > 0
Distribution of a standardized mean with unknown variance.
8. F-distribution
PDF: f(x) = \frac{Γ(\frac{d_1 + d_2}{2})}{Γ(\frac{d_1}{2})\, Γ(\frac{d_2}{2})} \left(\frac{d_1}{d_2}\right)^{d_1/2} \frac{x^{d_1/2 - 1}}{\left(1 + \frac{d_1 x}{d_2}\right)^{(d_1 + d_2)/2}}
Support: x ≥ 0; d_1, d_2 > 0
Ratio of two scaled chi-square distributions.


9. Log-Normal
PDF: f(x) = \frac{1}{x \sqrt{2πσ^2}} e^{-\frac{(\ln x - μ)^2}{2σ^2}}, x > 0
Support: x > 0, μ ∈ R, σ > 0
Model for multiplicative processes.
10. Cauchy
PDF: f(x) = \frac{1}{πγ \left[1 + \left(\frac{x - x_0}{γ}\right)^2\right]}, x ∈ (−∞, ∞)
Support: x ∈ (−∞, ∞), x_0 ∈ R, γ > 0
Heavy-tailed distribution without a finite mean or variance.
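As with the discrete case, a short sketch can tie a few of these densities to executable code. This assumes Python with numpy/scipy (not part of the original text) and arbitrary parameter values.

```python
# Sketch: evaluate a few continuous PDFs with scipy.stats and compare to the closed forms above.
import numpy as np
from scipy import stats
from scipy.special import beta as B, gamma as G

x = 1.3
mu, sigma, lam, k, alpha, beta = 0.0, 1.0, 2.0, 3.0, 2.0, 5.0   # illustrative values

# Normal
print(stats.norm.pdf(x, mu, sigma),
      np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2))

# Exponential (scipy parametrizes by scale = 1/λ)
print(stats.expon.pdf(x, scale=1/lam), lam * np.exp(-lam * x))

# Gamma (shape k, rate λ, i.e. scale = 1/λ)
print(stats.gamma.pdf(x, k, scale=1/lam),
      lam**k * x**(k - 1) * np.exp(-lam * x) / G(k))

# Beta
xb = 0.3
print(stats.beta.pdf(xb, alpha, beta),
      xb**(alpha - 1) * (1 - xb)**(beta - 1) / B(alpha, beta))

# Chi-square with k degrees of freedom
print(stats.chi2.pdf(x, k),
      x**(k/2 - 1) * np.exp(-x/2) / (2**(k/2) * G(k/2)))
```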

These distributions form the foundation for a wide variety of statistical methods, including hypothesis
testing, Bayesian modeling, and regression analysis. Would you like help visualizing any of these or
deriving properties like mean, variance, or moments?

I need help in linking the functions conceptually so that they are easy to apply and remember

To link and remember these probability functions conceptually, focus on grouping them by their
relationships and contexts of use, then build a story around them. Here’s how you can do it:

1. Start with the Foundation: The Building Blocks

Bernoulli: A single success-or-failure trial. Think of flipping a coin.

Binomial: A series of independent Bernoulli trials. The number of successes in n trials.

Conceptual Link: Binomial extends Bernoulli by repeating it multiple times.


Geometric: "How many trials until the first success?" (waiting for success).
Negative Binomial: "How many trials until the r-th success?" (generalized waiting).
Key Connection: All of these involve success-failure trials governed by the probability p.

2. Discrete Events in Time or Space

Poisson: Counts events in a fixed interval (time/space).


Story: Imagine customers arriving at a store or radioactive particles decaying.
Link to Binomial: Poisson arises as a limit of Binomial when n → ∞ and p → 0, keeping
np = λ constant.
Key Insight: Poisson is for rare events over time/space, while Binomial is for fixed n.
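This limit is easy to check numerically. A minimal sketch, assuming scipy is available (λ, x, and the grid of n are arbitrary choices): hold np = λ fixed and let n grow; the Binomial PMF at a point converges to the Poisson PMF.

```python
# Sketch: Binomial(n, p) with np = λ fixed approaches Poisson(λ) as n grows.
from scipy import stats

lam, x = 3.0, 5
for n in (10, 100, 1000, 10000):
    p = lam / n                         # keep np = λ constant
    print(n, stats.binom.pmf(x, n, p))  # Binomial probability of exactly x events
print("Poisson limit:", stats.poisson.pmf(x, lam))
```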

3. Continuous Distributions: Smooth Counterparts

Uniform (Continuous): Equal likelihood over an interval. Think of picking a point completely at random from [a, b], with every value equally likely.
Exponential: Time until the first event occurs (e.g., time between bus arrivals).
Link to Poisson: Exponential gives the waiting time between events in a Poisson process.
Key Connection: Poisson counts events, Exponential gives the waiting time between them.
Gamma: Sum of k Exponential random variables.

Story: Time until the k -th event occurs in a Poisson process.


Conceptual Hierarchy:
Uniform → Base of many models.
Exponential → Waiting times.
Gamma → Generalized waiting for multiple events.
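The Exponential–Gamma link in this hierarchy can be verified by simulation. A sketch, assuming numpy/scipy and arbitrary parameter choices: summing k Exponential(λ) gaps reproduces the Gamma(k, λ) mean, variance, and distribution.

```python
# Sketch: the sum of k independent Exponential(λ) waiting times behaves like Gamma(k, λ).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, lam, n_sim = 4, 2.0, 100_000

# Time until the k-th event = sum of k exponential inter-arrival gaps.
waits = rng.exponential(scale=1/lam, size=(n_sim, k)).sum(axis=1)

# Compare simulated moments with Gamma(k, λ): mean k/λ, variance k/λ².
print(waits.mean(), k / lam)
print(waits.var(), k / lam**2)

# Kolmogorov–Smirnov distance against the Gamma CDF should be close to zero.
print(stats.kstest(waits, stats.gamma(k, scale=1/lam).cdf).statistic)
```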

4. Symmetry and Natural Phenomena

Normal (Gaussian): Bell-shaped, symmetric, and arises naturally due to the Central Limit
Theorem.
Key Story: Heights, weights, and measurement errors follow the normal distribution because
they result from summing many small effects.
Log-Normal: If something grows multiplicatively (e.g., stock prices), the log of the quantity is
normally distributed.
Conceptual Link: Normal → Linear additive effects; Log-Normal → Multiplicative effects.
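The additive-vs-multiplicative contrast shows up in a small simulation. A sketch assuming numpy/scipy (the factor range 0.9–1.1 and the sample sizes are arbitrary): sums of many small effects are nearly symmetric, products of the same effects are right-skewed, and taking logs of the products makes them symmetric again.

```python
# Sketch: sums of many small effects look roughly Normal, products look Log-Normal.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(1)
effects = rng.uniform(0.9, 1.1, size=(100_000, 50))   # many small positive factors

sums = effects.sum(axis=1)        # additive model
products = effects.prod(axis=1)   # multiplicative model

# Sums and log-products should have skewness near 0;
# the products themselves are right-skewed, as a Log-Normal is.
print(skew(sums), skew(products), skew(np.log(products)))
```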

5. Heavy-Tails and Outliers

Cauchy: Extremely heavy tails, meaning it’s dominated by extreme values (no finite
mean/variance).
Story: A noisy signal where large deviations are common.
Key Connection: Cauchy arises as the ratio of two independent standard normal variables.
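A sketch of the ratio-of-normals construction, assuming numpy/scipy: the empirical quartiles of the simulated ratio should sit near the standard Cauchy quartiles of ±1 (with median 0), even though the sample mean never settles down.

```python
# Sketch: the ratio of two independent standard normals is standard Cauchy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
ratio = rng.standard_normal(100_000) / rng.standard_normal(100_000)

# Compare empirical quartiles with the Cauchy quartiles (±1 for x0 = 0, γ = 1).
print(np.percentile(ratio, [25, 50, 75]))
print(stats.cauchy.ppf([0.25, 0.5, 0.75]))
```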

6. Relationships from Ratios

Chi-Square: Sum of squares of standard normal variables. Used in variance estimation.


Link: A special case of the Gamma distribution with shape k = (degrees of freedom)/2 and rate λ = 1/2.
t-Distribution: Ratio of a standard normal to the square root of a chi-square divided by its degrees of freedom. Used when variance is estimated.
Story: t bridges the gap between Normal (known variance) and small-sample inferences
(unknown variance).
F-Distribution: Ratio of two independent scaled chi-squares. Used in comparing variances (e.g.,
ANOVA).

Key Insight: t and F are both built from the Chi-Square distribution.
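These constructions can be checked directly by simulation. A sketch assuming numpy (the degrees of freedom are arbitrary): build χ², t, and F from standard normal draws and compare their sample moments with the textbook values.

```python
# Sketch: χ², t, and F built directly from standard normal draws.
import numpy as np

rng = np.random.default_rng(2)
n_sim, k, d1, d2 = 100_000, 5, 5, 10

z = rng.standard_normal((n_sim, k))
chi2 = (z**2).sum(axis=1)                             # sum of k squared normals -> χ²_k

t = rng.standard_normal(n_sim) / np.sqrt(chi2 / k)    # N(0,1) / sqrt(χ²_k / k) -> t_k

u = (rng.standard_normal((n_sim, d1))**2).sum(axis=1)
v = (rng.standard_normal((n_sim, d2))**2).sum(axis=1)
f = (u / d1) / (v / d2)                               # ratio of scaled χ² -> F_{d1,d2}

# Check against textbook moments: E[χ²_k] = k, Var[t_k] = k/(k-2), E[F] = d2/(d2-2).
print(chi2.mean(), k)
print(t.var(), k / (k - 2))
print(f.mean(), d2 / (d2 - 2))
```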

7. Constraining Probabilities

Beta: Distributions of probabilities themselves (bounded between 0 and 1).


Story: Models prior knowledge of success probabilities in Bayesian statistics.
Connection to Binomial: The Beta distribution is the conjugate prior for the Binomial.
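A minimal sketch of the conjugate update, assuming scipy and made-up counts: a Beta(α, β) prior combined with s successes and f failures in Binomial data gives a Beta(α + s, β + f) posterior, so the update is just addition.

```python
# Sketch: Beta prior + Binomial data -> Beta posterior (conjugacy).
from scipy import stats

alpha, beta = 2, 2          # prior Beta(α, β) over the success probability (illustrative)
successes, failures = 7, 3  # observed Binomial data (illustrative)

posterior = stats.beta(alpha + successes, beta + failures)
print("posterior mean:", posterior.mean())            # (α + s) / (α + β + s + f)
print("95% credible interval:", posterior.interval(0.95))
```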

Conceptual Summary

1. Bernoulli and Binomial: Success-failure trials.
2. Geometric and Negative Binomial: Waiting for success.
3. Poisson and Exponential: Rare events in time/space (counts and waiting times).
4. Gamma: Generalized waiting for k events.
5. Normal and Log-Normal: Summation and multiplication effects.
6. Chi-Square, t, and F : Variance and ratio-based inferences.
7. Beta: Modeling probabilities.

Would you like help building visual aids, flowcharts, or practice problems to solidify these links?

Practice problems

Here are practice problems organized by conceptual groups to help reinforce the connections between
the distributions:

Group 1: Bernoulli, Binomial, Geometric, and Negative Binomial

1. Bernoulli:
A coin has a 70% chance of landing heads. Let X be the outcome of one flip (X = 1 for
heads, X = 0 for tails).
(a) What is P (X = 1)?
(b) What is E(X) and Var(X)?
2. Binomial:
A die is rolled 10 times. Let X be the number of times a 6 appears.
(a) What is the PMF of X ?
(b) What is P (X = 3)?
(c) Compute E(X) and Var(X).
3. Geometric:
A basketball player makes a free throw with a probability of 0.75. Let X represent the number
of attempts until her first success.
(a) What is the PMF of X ?
(b) What is P (X = 4)?
(c) Compute E(X) and Var(X).
4. Negative Binomial:
A gambler plays a game with a 20% chance of winning. Let X be the number of games
played until the gambler wins 5 times.
(a) Write the PMF of X .
(b) What is P (X = 10)?
(c) Compute E(X) and Var(X).
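If you want to spot-check your answers numerically, here is a sketch for Group 1 using scipy (an assumption; these are checks, not worked solutions).

```python
# Sketch: numerical spot-checks for Group 1.
from scipy import stats

# 1. Bernoulli(p = 0.7): P(X = 1), E(X), Var(X)
print(stats.bernoulli.pmf(1, 0.7), stats.bernoulli.mean(0.7), stats.bernoulli.var(0.7))

# 2. Binomial(n = 10, p = 1/6): P(X = 3), E(X), Var(X)
print(stats.binom.pmf(3, 10, 1/6), stats.binom.mean(10, 1/6), stats.binom.var(10, 1/6))

# 3. Geometric(p = 0.75), trials until first success: P(X = 4), E(X), Var(X)
print(stats.geom.pmf(4, 0.75), stats.geom.mean(0.75), stats.geom.var(0.75))

# 4. Negative Binomial, trials until the 5th win with p = 0.2:
#    scipy counts failures, so P(10 trials) = nbinom.pmf(10 - 5, 5, 0.2);
#    E(trials) = r/p, Var(trials) = r(1 - p)/p².
r, p = 5, 0.2
print(stats.nbinom.pmf(10 - 5, r, p), r / p, r * (1 - p) / p**2)
```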

Group 2: Poisson and Exponential

5. Poisson:
A call center receives an average of 3 calls per minute. Let X be the number of calls in a
randomly selected minute.
(a) What is the PMF of X ?
(b) What is P (X = 5)?
(c) Compute E(X) and Var(X).
6. Exponential:

In the same call center, the time between calls follows an exponential distribution. Let T
represent the time (in minutes) between two calls, with rate λ = 3.
(a) Write the PDF of T .
(b) What is P(T > 1)?
(c) Compute E(T ) and Var(T ).
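A corresponding sketch for Group 2, again assuming scipy and intended only as a numerical check; note that scipy's exponential uses scale = 1/λ.

```python
# Sketch: numerical spot-checks for Group 2.
from scipy import stats

# 5. Poisson(λ = 3): P(X = 5), E(X), Var(X)
print(stats.poisson.pmf(5, 3), stats.poisson.mean(3), stats.poisson.var(3))

# 6. Exponential with rate λ = 3 (scipy uses scale = 1/λ): P(T > 1), E(T), Var(T)
expo = stats.expon(scale=1/3)
print(expo.sf(1), expo.mean(), expo.var())   # sf(1) = P(T > 1) = e^{-3}
```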

Group 3: Normal, Log-Normal, and Cauchy

7. Normal:
Heights of a population are normally distributed with μ = 170 cm and σ = 10 cm.
(a) What is the PDF of this distribution?
(b) What is P (160 ≤ X ≤ 180)?
(c) If we standardize X, what is the distribution of Z = (X − μ)/σ?
8. Log-Normal:
The lifetime of a device follows a log-normal distribution with parameters μ = 2 and σ² = 0.5.
(a) Write the PDF of the log-normal distribution.
(b) Compute E(X) in terms of μ and σ .
(c) What is the 95th percentile of X ?
9. Cauchy:
Let X follow a standard Cauchy distribution (x_0 = 0, γ = 1).
(a) Write the PDF of X .
(b) What is P (−1 ≤ X ≤ 1)?
(c) Discuss why the mean and variance of X do not exist.
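A sketch for Group 3, assuming scipy/numpy; note that scipy's lognorm takes s = σ and scale = e^μ, which is an implementation detail rather than part of the problems.

```python
# Sketch: numerical spot-checks for Group 3.
import numpy as np
from scipy import stats

# 7. Normal(μ = 170, σ = 10): P(160 ≤ X ≤ 180)
norm = stats.norm(170, 10)
print(norm.cdf(180) - norm.cdf(160))

# 8. Log-Normal with μ = 2, σ² = 0.5 (scipy: s = σ, scale = e^μ)
sigma = np.sqrt(0.5)
lognorm = stats.lognorm(s=sigma, scale=np.exp(2))
print(lognorm.mean(), np.exp(2 + 0.5 / 2))   # E(X) = e^{μ + σ²/2}
print(lognorm.ppf(0.95))                     # 95th percentile

# 9. Standard Cauchy: P(−1 ≤ X ≤ 1) = 1/2
print(stats.cauchy.cdf(1) - stats.cauchy.cdf(-1))
```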

Group 4: Chi-Square, t-Distribution, and F -Distribution

10. Chi-Square:
Let X ∼ χ²_5 (5 degrees of freedom).

(a) Write the PDF of X .


(b) What is P (X > 10)?
(c) Compute E(X) and Var(X).
11. t-Distribution:

A dataset of 10 observations is sampled from a normal distribution, and the sample variance
is used to compute a t-statistic. Let T ∼ t_9.

(a) Write the PDF of T .


(b) What is P (T > 2)?
(c) Compare the shape of t_9 to the standard normal distribution.

12. F -Distribution:

Two independent samples are drawn from normal populations with variances σ_1² and σ_2². The ratio of their
sample variances, each scaled by its population variance, follows an F-distribution. Let X ∼ F_{5,10}.

(a) Write the PDF of X .


(b) What is P(X > 3)?
(c) Compute E(X) (if it exists).
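A sketch for Group 4, assuming scipy; the tail probabilities use the survival function sf(x) = P(X > x).

```python
# Sketch: numerical spot-checks for Group 4.
from scipy import stats

# 10. Chi-square, 5 degrees of freedom: P(X > 10), E(X) = 5, Var(X) = 10
print(stats.chi2.sf(10, 5), stats.chi2.mean(5), stats.chi2.var(5))

# 11. t with 9 degrees of freedom: P(T > 2), compared with the standard normal tail
print(stats.t.sf(2, 9), stats.norm.sf(2))

# 12. F(5, 10): P(X > 3) and E(X) = d2 / (d2 - 2)
print(stats.f.sf(3, 5, 10), stats.f.mean(5, 10))
```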
Group 5: Beta and Gamma

13. Beta:
Let X ∼ Beta(2, 5).
(a) Write the PDF of X .
(b) Compute E(X) and Var(X).
(c) What is P (X ≤ 0.3)?
14. Gamma:
A system has failures occurring according to a Poisson process, and the time until the 3rd
failure follows a Gamma distribution. Let X ∼ Gamma(k = 3, λ = 2).
(a) Write the PDF of X .
(b) Compute E(X) and Var(X).
(c) What is P (X > 2)?
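Finally, a sketch for Group 5, assuming scipy; remember that scipy's gamma uses scale = 1/λ, so Gamma(k = 3, λ = 2) becomes gamma(3, scale = 0.5).

```python
# Sketch: numerical spot-checks for Group 5.
from scipy import stats

# 13. Beta(2, 5): E(X), Var(X), P(X ≤ 0.3)
print(stats.beta.mean(2, 5), stats.beta.var(2, 5), stats.beta.cdf(0.3, 2, 5))

# 14. Gamma(k = 3, λ = 2), i.e. scale = 1/2: E(X) = k/λ, Var(X) = k/λ², P(X > 2)
gam = stats.gamma(3, scale=0.5)
print(gam.mean(), gam.var(), gam.sf(2))
```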

Let me know if you need solutions, derivations, or hints for any of these!
