T2 QA 1 Probability v3.3 Sample
Quantitative Analysis
Chapter 1: Fundamentals of Probability
Describe an event and an event space.
Describe independent events and mutually exclusive events.
Explain the difference between independent events and conditionally independent events.
Calculate the probability of an event given a discrete probability function.
Define and calculate a conditional probability.
Distinguish between conditional and unconditional probabilities.
Explain and apply Bayes’ rule.
Describe an event and an event space.
The sample space, denoted Ω, is the set of all possible outcomes. Its contents depend on the
situation and might include nominal, ordinal (aka, categorical), interval, or ratio data types. If the
sample set includes real numbers, it will be denoted by ℝ, as would probably be the case for
asset returns.
Events, denoted by ω, are subsets of the sample space. An event is a set of outcomes (but an
event can contain zero elements). An elementary event includes only one outcome. The event
space, denoted by ℱ, includes events (aka, outcome combinations) to which we can assign
probabilities. For example, imagine two bonds where each bond either defaults or survives. We
can observe four events:
{A defaults but B survives}, {B defaults, but A survives},
{Both A and B default}, or {Both A and B survive; aka, Neither A nor B defaults}.
This event space contains four events, which is a finite number, so it can thus be called a
discrete probability space. Alternatively, we can define the event {only one bond defaults} such
that this alternative event space contains only three events: {none default, only one defaults, or
both default}, but the total probability will still sum to 100.0% because the probability of {only one
default} equals the sum of {A defaults, but B survives} and {B defaults, but A survives}.
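As a quick illustration (the 2% and 3% default probabilities below are assumed purely for this sketch, and the two bonds are treated as independent), the four-event space can be enumerated and its probabilities checked to sum to 100%:

```python
from itertools import product

# Illustrative (assumed) marginal default probabilities for the two bonds,
# treated as independent for this sketch only.
P_DEFAULT = {"A": 0.02, "B": 0.03}

def prob(outcome):
    """Probability of one elementary event, e.g. ('default', 'survive')."""
    p = 1.0
    for bond, state in zip(("A", "B"), outcome):
        p *= P_DEFAULT[bond] if state == "default" else 1 - P_DEFAULT[bond]
    return p

# The four elementary events of the discrete probability space
event_space = list(product(("default", "survive"), repeat=2))
probs = {outcome: prob(outcome) for outcome in event_space}

for outcome, p in probs.items():
    print(outcome, round(p, 6))

# Probabilities over the whole event space must sum to 1.0 (i.e., 100%)
print("total:", round(sum(probs.values()), 6))

# The coarser event {only one bond defaults} is the union of two mutually
# exclusive elementary events, so its probability is the sum of theirs.
only_one = probs[("default", "survive")] + probs[("survive", "default")]
print("P(only one defaults):", round(only_one, 6))
```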
If two events are mutually exclusive, the probability of either occurring is the sum of their
individual probabilities. Statistically:
P[A ∪ B] = P[A] + P[B] if A and B are mutually exclusive
This equality is true only for mutually exclusive events. This property of mutually exclusive
events can be extended to any number of events. The probability that any of n mutually
exclusive events occurs is the sum of the probabilities of (each of) those n events.
The random variables X and Y are independent if the conditional distribution of Y given X equals
the marginal distribution of Y. Since independence implies P (Y=y | X=x) = P(Y=y):
P(Y = y | X = x) = P(X = x, Y = y) / P(X = x)
Statistical independence is when the value taken by one variable has no effect on the value
taken by the other variable. If the variables are independent, their joint probability will equal
the product of their marginal probabilities. If they are not independent, they are dependent.
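As a rough sketch (the joint distribution below is an assumed example, not taken from the text), the conditional distribution of Y given X can be computed from a joint probability table and compared with the marginal distribution of Y:

```python
# Assumed joint PMF of two discrete variables X and Y (keys are (x, y) pairs).
joint = {
    (0, 0): 0.12, (0, 1): 0.28,
    (1, 0): 0.18, (1, 1): 0.42,
}

# Marginal distributions: sum the joint PMF over the other variable
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

# Conditional distribution of Y given X = x: joint probability / marginal of X
for (x, y), p_xy in sorted(joint.items()):
    cond = p_xy / p_x[x]
    print(f"P(Y={y} | X={x}) = {cond:.2f}  vs  P(Y={y}) = {p_y[y]:.2f}")

# Here every conditional equals the marginal, so X and Y are independent:
# the joint table was constructed as the product of its marginals
# (0.40/0.60 for X and 0.30/0.70 for Y).
```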
1 Michael Miller, Mathematics and Statistics for Financial Risk Management, 2nd Edition (Hoboken, NJ: John Wiley & Sons, 2013).
The most useful test of statistical independence is given by:
P(X = x, Y = y) = P(X = x) × P(Y = y)
That is, random variables X and Y are independent if their joint distribution is equal to the
product of their marginal distributions. For example, when rolling two dice, the outcome of the
second one will be independent of the first. This independence implies that the probability of
rolling double-sixes is equal to the product of P(rolling one six) multiplied by P(rolling one six).
So, if two dice are independent, then:
P (first roll = 6, second roll = 6) = P (rolling a six) × P (rolling a six) = (1/6) × (1/6) = 1/36
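A short sketch (not from the text) confirms the 1/36 figure both by enumerating the 36 equally likely outcomes and by a small Monte Carlo simulation; the seed and trial count are arbitrary choices:

```python
import random

# Exact probability by enumerating the 36 equally likely outcomes
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p_exact = sum(1 for (i, j) in outcomes if i == 6 and j == 6) / len(outcomes)
print("exact P(double six):", p_exact)            # 1/36 ≈ 0.0278

# Monte Carlo estimate: with independent rolls, the joint frequency should
# approach the product of the marginal probabilities, (1/6) × (1/6)
random.seed(42)
trials = 100_000
hits = 0
for _ in range(trials):
    first, second = random.randint(1, 6), random.randint(1, 6)
    if first == 6 and second == 6:
        hits += 1
print("simulated P(double six):", hits / trials)
```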
If the probability of an event does not depend on another event, the events are independent. If
two events are independent, then their joint probability is equal to the product of their
unconditional probabilities:
P(A ∩ B) = P(A) × P(B)
If events are independent, then conditional probability is equal to the unconditional probability:
P(B | A) = P(A ∩ B) / P(A) = [P(A) × P(B)] / P(A) = P(B)
By the same logic, if the events are independent, then it is also true that P(A | B) = P(A).
Mutually exclusive events cannot both occur such that P(A∩B) = 0. This implies that mutually
exclusive events cannot be independent; i.e., mutually exclusive events are dependent.
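A minimal numeric check (the 30% and 20% probabilities are assumed purely for illustration) shows the conditional probability collapsing to the unconditional probability under independence, and equaling zero under mutual exclusivity:

```python
# Independent events: P(B | A) = P(A ∩ B) / P(A) collapses to P(B).
p_a, p_b = 0.30, 0.20
p_a_and_b = p_a * p_b                      # joint probability under independence
print("P(B | A):", p_a_and_b / p_a)        # 0.20, which equals P(B)

# Mutually exclusive events: the joint probability is zero, so
# P(B | A) = 0 ≠ P(B); mutually exclusive events are dependent.
p_a_and_b_exclusive = 0.0
print("P(B | A) when mutually exclusive:", p_a_and_b_exclusive / p_a)
```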
Conditional Independence
Independence can be conditional on another event. A and B are conditionally independent if:
P(A ∩ B | C) = P(A | C) × P(B | C)
P(A ∩ B | C) denotes the conditional probability that A and B jointly occur conditional on
outcome C. Events can be conditionally independent yet unconditionally dependent. Events
can also be conditionally dependent, yet unconditionally independent!
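One standard way to see this (the two-coin setup below is an assumed illustration, not from the text) is to pick one of two coins at random and flip it twice: conditional on which coin was drawn, the flips are independent, but unconditionally they are dependent.

```python
# Assumed setup: pick one of two coins at random (event C = which coin), then
# flip it twice (A = first flip is heads, B = second flip is heads).
coins = {"fair": 0.5, "biased": 0.9}   # P(heads | coin)
p_coin = 0.5                           # each coin drawn with probability 1/2

# Conditional independence: P(A ∩ B | C) = P(A | C) × P(B | C) for each coin
for name, p_h in coins.items():
    print(f"P(A ∩ B | {name}) = {p_h * p_h:.4f} = P(A | {name}) × P(B | {name})")

# Unconditional probabilities, by total probability over the coin choice
p_a = sum(p_coin * p_h for p_h in coins.values())          # = P(B) by symmetry, 0.70
p_ab = sum(p_coin * p_h * p_h for p_h in coins.values())   # joint P(A ∩ B)
print("P(A) × P(B) =", round(p_a * p_a, 4))                # 0.49
print("P(A ∩ B)    =", round(p_ab, 4))                     # 0.53 ≠ 0.49, so A and B are dependent
```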
Calculate the probability of an event for a discrete probability
function.
Probability: Classical or “a priori” definition
P(event) = (number of outcomes favorable to the event) / (total number of equally likely outcomes)
For example, consider a craps roll of two six-sided dice. What is the probability of rolling a
seven? i.e., P[X=7]. There are six outcomes that generate a roll of seven: 1+6, 2+5, 3+4, 4+3,
5+2, and 6+1. Further, there are 36 total outcomes. Therefore, the probability is 6/36 = 1/6.
In this case, the outcomes need to be mutually exclusive, equally likely, and “collectively
exhaustive” (i.e., all possible outcomes included in total). A key property of a probability is that
the sum of the probabilities for all (discrete) outcomes is 1.0.
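The craps example can be checked by brute-force enumeration; this is a small sketch rather than anything from the text:

```python
# Classical (a priori) probability by enumeration: P(sum of two dice = 7)
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]   # 36 equally likely outcomes
favorable = [o for o in outcomes if sum(o) == 7]                # (1,6), (2,5), ..., (6,1)
print(len(favorable), "/", len(outcomes), "=", len(favorable) / len(outcomes))  # 6 / 36 = 1/6
```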
Relative frequency is based on an actual number of historical observations (or Monte Carlo
simulations). For example, here is a simulation (produced in Excel) of one hundred (100) rolls of
a single six-sided die:
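The Excel output is not reproduced in this sample; the following Python sketch performs an equivalent simulation of 100 rolls (the seed and resulting counts are illustrative only):

```python
import random
from collections import Counter

# 100 rolls of a single fair six-sided die, compared with the a priori 1/6
random.seed(1)
rolls = [random.randint(1, 6) for _ in range(100)]
counts = Counter(rolls)

for face in range(1, 7):
    freq = counts[face] / len(rolls)
    print(f"face {face}: count = {counts[face]:3d}, relative frequency = {freq:.2f} (a priori 1/6 ≈ 0.17)")
```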
This also relates to sampling variation. The a priori probability is based on population properties;
in this case, the a priori probability of rolling any number is clearly 1/6th. However, a sample of
100 trials will exhibit sampling variation: the observed frequency of each outcome (e.g., the
number of threes rolled above) varies from the parametric probability of 1/6th. We do not expect
the sample to produce 1/6th perfectly for each outcome.