Advanced Topic 1
REVIEW
• The sample space S of an experiment (whose outcome is uncertain) is
the set of all possible outcomes of the experiment.
• Any subset E of the sample space S is known as an event; i.e. an event
is a set consisting of possible outcomes of the experiment.
• Given events E and F, E ∪ F is the set of all outcomes either in E or F
or in both E and F. E ∪ F occurs if either E or F occurs. E ∪ F is the
union of events E and F.
• If E₁, E₂, … are events, then their union, denoted ∪_{i=1}^∞ Eᵢ, is the event
consisting of all outcomes that belong to at least one of the Eᵢ.
REVIEW
• Given events E and F, E ∩ F is the set of all outcomes which are both
in E and F. E ∩ F is also denoted EF and is called the intersection of E
and F
• If E₁, E₂, … are events, then their intersection, denoted ∩_{i=1}^∞ Eᵢ, is the
event consisting of the outcomes that belong to every Eᵢ.
• Given two events E and F, if EF = ∅, then E and F are said to be mutually
exclusive.
REVIEW
• For any event E, we define the new event Eᶜ (also written ~E or Ē), referred to
as the complement of E, to consist of all outcomes in the sample space S that
are not in E. Hence we have
E ∪ Eᶜ = S; E ∩ Eᶜ = ∅
• For any two events E and F, we write E ⊂ F if all the outcomes of E are
in F.
REVIEW
• We say that a collection of events forms exhaustive outcomes, or that
this collection forms a partition of the probability space, if their
union is the entire probability space, and they are mutually exclusive.
• A₁, A₂, …, Aₙ are called exhaustive events if
A₁ ∪ A₂ ∪ … ∪ Aₙ = S
Aᵢ ∩ Aⱼ = ∅, i ≠ j
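A minimal Python sketch (not part of the slides; the sets are made-up examples) that checks the two conditions above for a small family of events:

```python
# Illustrative sketch (not from the slides): checking that a made-up family of
# events A_1, ..., A_n is exhaustive and mutually exclusive, i.e. a partition of S.
S = {1, 2, 3, 4, 5, 6}
A = [{1, 2}, {3}, {4, 5, 6}]

assert set().union(*A) == S                    # A_1 ∪ A_2 ∪ ... ∪ A_n = S
assert all(A[i].isdisjoint(A[j])               # A_i ∩ A_j = ∅ for i ≠ j
           for i in range(len(A)) for j in range(i + 1, len(A)))
print("A_1, ..., A_n form a partition of S")
```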
REVIEW
• De Morgan's laws:
(∪_{i=1}^∞ Eᵢ)ᶜ = ∩_{i=1}^∞ Eᵢᶜ
(∩_{i=1}^∞ Eᵢ)ᶜ = ∪_{i=1}^∞ Eᵢᶜ
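The identities above can be checked mechanically on a finite sample space; the following Python sketch (illustrative, with made-up events) does so with set operations:

```python
# Illustrative sketch (not from the slides): verifying De Morgan's laws with
# Python sets on a small made-up sample space and a finite family of events.
S = {1, 2, 3, 4, 5, 6}
events = [{1, 2}, {2, 4, 6}, {3, 4, 5}]        # an arbitrary family E_1, E_2, E_3

def complement(E):
    """Complement of the event E relative to the sample space S."""
    return S - E

union_all = set().union(*events)               # ∪ E_i
inter_all = set(S).intersection(*events)       # ∩ E_i

# (∪ E_i)^c = ∩ E_i^c
assert complement(union_all) == set(S).intersection(*map(complement, events))
# (∩ E_i)^c = ∪ E_i^c
assert complement(inter_all) == set().union(*map(complement, events))
```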
REVIEW
• Probability is a function that assigns a number between 0 and 1 to each event
(0 ≤ P(E) ≤ 1), with the following defining properties:
• P(S) = 1
• P(∅) = 0
• For any sequence of mutually exclusive events E₁, E₂, …, i.e. Eᵢ ∩ Eⱼ = ∅
when i ≠ j, we have
P(∪_{i=1}^∞ Eᵢ) = Σ_{i=1}^∞ P(Eᵢ)
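A small Python sketch of these defining properties on a finite, made-up model (a loaded die); the weights below are assumptions chosen only for illustration:

```python
# Illustrative sketch (not from the slides): a probability function on a finite
# sample space given by made-up outcome weights (a loaded die), checking the
# defining properties listed above.
weights = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.2, 6: 0.4}   # weights sum to 1
S = set(weights)

def P(E):
    """P(E) = sum of the weights of the outcomes in E."""
    return sum(weights[o] for o in E)

assert abs(P(S) - 1.0) < 1e-12                 # P(S) = 1
assert P(set()) == 0                           # P(∅) = 0

E1, E2 = {1, 2}, {5, 6}                        # mutually exclusive: E1 ∩ E2 = ∅
assert abs(P(E1 | E2) - (P(E1) + P(E2))) < 1e-12   # additivity over disjoint events
```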
REVIEW
• P(Eᶜ) = 1 − P(E)
• If E ⊂ F then P(E) ≤ P(F)
• P(E ∪ F) = P(E) + P(F) − P(E ∩ F)
• P(E ∪ F ∪ G) = P(E) + P(F) + P(G)
− P(E ∩ F) − P(E ∩ G) − P(F ∩ G)
+ P(E ∩ F ∩ G)
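These rules can be verified numerically; the sketch below (illustrative, reusing the same made-up loaded-die weights) checks the complement rule and the three-event inclusion-exclusion identity:

```python
# Illustrative sketch (not from the slides): numerically checking the complement
# rule and three-event inclusion-exclusion; the weights and events are made up.
weights = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.2, 6: 0.4}
S = set(weights)
P = lambda A: sum(weights[o] for o in A)

E, F, G = {1, 2, 3}, {2, 3, 4}, {3, 4, 5}
assert abs(P(S - E) - (1 - P(E))) < 1e-12      # P(E^c) = 1 - P(E)

lhs = P(E | F | G)
rhs = (P(E) + P(F) + P(G)
       - P(E & F) - P(E & G) - P(F & G)
       + P(E & F & G))
assert abs(lhs - rhs) < 1e-12                  # inclusion-exclusion for three events
```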
EX1.1
EX1.2
EX1.3
EX1.4
EX1.5
EX1.6
EX1.7
REVIEW
• When the sample space S has equally likely outcomes, then for any event E,
P(E) = (number of outcomes in E) / (number of outcomes in S) = n(E) / n(S)
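A short Python sketch of this counting formula; the two-dice experiment is a made-up example, not one of the slide exercises:

```python
# Illustrative sketch (not from the slides): P(E) = n(E)/n(S) for equally likely
# outcomes. Made-up example: the probability that two fair dice sum to 7.
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))       # all 36 equally likely outcomes
E = [(a, b) for (a, b) in S if a + b == 7]     # outcomes belonging to the event
print(Fraction(len(E), len(S)))                # 1/6
```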
EX1.8
EX1.9
REVIEW
Conditional Probabilities: Consider an experiment with sample space S.
Let E and F be two events. The conditional probability of E given F is
denoted by P(E|F) and, provided P(F) > 0, is defined by
P(E|F) = P(E ∩ F) / P(F)
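A minimal sketch of the definition, again on a made-up two-dice example; the events E and F below are assumptions chosen for illustration:

```python
# Illustrative sketch (not from the slides): computing P(E|F) = P(E ∩ F) / P(F)
# by counting equally likely outcomes. Made-up events on two fair dice:
# E = "the sum is 8", F = "the first die shows 5".
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))
P = lambda A: Fraction(len(A), len(S))

E = {(a, b) for (a, b) in S if a + b == 8}
F = {(a, b) for (a, b) in S if a == 5}
print(P(E & F) / P(F))                         # 1/6: the second die must show 3
```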
EX1.10
EX1.11
REVIEW
• Multiplicative rule: Let E₁, E₂, …, Eₙ be a sequence of events. Then
P(E₁ ∩ E₂ ∩ ⋯ ∩ Eₙ)
= P(E₁) · P(E₂|E₁) · P(E₃|E₁ ∩ E₂) ⋯ P(Eₙ|E₁ ∩ E₂ ∩ ⋯ ∩ Eₙ₋₁)
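A short sketch of the multiplicative rule on a made-up card-drawing example; the cross-check against direct counting is only for illustration:

```python
# Illustrative sketch (not from the slides): the multiplicative rule on the
# made-up example "draw 3 cards without replacement; all three are hearts",
# with E_k = "the k-th card is a heart".
from fractions import Fraction
from math import comb

# P(E_1) * P(E_2|E_1) * P(E_3|E_1 ∩ E_2)
chain = Fraction(13, 52) * Fraction(12, 51) * Fraction(11, 50)

# Cross-check by counting unordered 3-card hands directly.
direct = Fraction(comb(13, 3), comb(52, 3))
assert chain == direct
print(chain)                                   # 11/850
```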
EX1.12
REVIEW
We say that two events 𝐸, 𝐹 are independent if
𝑃(𝐸 ∩ 𝐹) = 𝑃(𝐸) 𝑃(𝐹)
or equivalently
𝑃(𝐸|𝐹) = 𝑃(𝐸) and 𝑃 (𝐹|𝐸) = 𝑃(𝐹)
• Law of total probability: for any events E and F,
P(E) = P(F) · P(E|F) + P(Fᶜ) · P(E|Fᶜ)
In general, assume ∪_{i=1}^n Fᵢ = S and Fᵢ ∩ Fⱼ = ∅ for all i ≠ j; then
P(E) = Σ_{i=1}^n P(Fᵢ) · P(E|Fᵢ)
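The identity can be checked on a small finite model; the weights, partition, and event below are made-up assumptions:

```python
# Illustrative sketch (not from the slides): checking the law of total
# probability on a small finite model; the weights, partition, and event
# E are made-up examples.
weights = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.2, 6: 0.4}
S = set(weights)
P = lambda A: sum(weights[o] for o in A)

partition = [{1, 2}, {3, 4}, {5, 6}]           # F_1, F_2, F_3: disjoint, union = S
E = {2, 3, 5}

total = sum(P(F) * (P(E & F) / P(F)) for F in partition)   # Σ P(F_i) · P(E|F_i)
assert abs(total - P(E)) < 1e-12
```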
EX1.15
REVIEW
Bayes’ Formula
P(Fⱼ|E) = [P(Fⱼ) · P(E|Fⱼ)] / [Σ_{i=1}^n P(Fᵢ) · P(E|Fᵢ)]
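A minimal sketch of Bayes' formula with a two-event partition; all the probabilities below are made-up illustration values, not data from the slides:

```python
# Illustrative sketch (not from the slides): Bayes' formula with a two-event
# partition, F_1 = "has the condition" and F_2 = "does not"; E = "test is
# positive". All numbers are made up for illustration.
priors = [0.01, 0.99]                          # P(F_1), P(F_2)
likelihoods = [0.95, 0.05]                     # P(E|F_1), P(E|F_2)

denom = sum(p * l for p, l in zip(priors, likelihoods))    # Σ P(F_i) · P(E|F_i)
posterior = priors[0] * likelihoods[0] / denom             # P(F_1|E)
print(round(posterior, 4))                     # ≈ 0.161
```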
EX1.16
EX1.17
EX1.18
EX1.19
EX1.20
EX1.21
EX1.22B
EX1.23D
EX1.24C
EX1.25A
EX1.26C
EX1.27E
EX1.28C
EX1.29C
EX1.30E
EX1.31B
EX1.32C
EX1.33B
EX1.34B
EX1.35B
HW1.1C
HW1.2B
HW1.3D
HW1.4E
HW1.5E
HW1.6D
HW1.7C
HW1.8D
HW1.9B
HW1.10C
HW1.11D
HW1.12D
HW1.13D
HW1.14C
HW1.15C