
Seismic Resistant Design of Structures
Theory of Probability
Lecture 4

M.Sc. in Earthquake Engineering
Institute of Engineering
Contents
1. Introduction
2. Simple Definitions
3. Types of Probability
4. Theorems of Probability
5. Probabilities under conditions of statistically
independent events
6. Probabilities under conditions of statistically
dependent events
7. Bayes Theorem
8. Glossary of Terms

2
Introduction
• If an experiment is repeated under essentially
homogeneous & similar conditions we generally
come across 2 types of situations:

– Deterministic/ Predictable: - The result, or what is usually known as the 'outcome', is unique or certain.

• Example: - The velocity 'v' of a particle after time 't' is given by
v = u + at
The equation uniquely determines v if the right-hand quantities are known.

3
– Unpredictable/ Probabilistic: - The result is not
unique but may be one of the several possible
outcomes.

• Examples: -
(i) In tossing a coin, one is not sure whether a head or a tail will be obtained.
(ii) If a light tube has lasted for t hours, nothing can be said about its further life. It may fail to function at any moment.

4
Simple Definitions
• Trial & Event

– Example: - Consider an experiment which, though repeated under essentially identical conditions, does not give unique results but may result in any one of the several possible outcomes.

– The experiment is known as a Trial & the outcomes are known as Events or Cases.
• Throwing a die is a Trial & getting 1 (2,3,…,6)
is an event.
• Tossing a coin is a Trial & getting Head (H) or
Tail (T) is an event.

5
• Exhaustive Events: - The total number of
possible outcomes in any trial.
– In tossing a coin there are 2 exhaustive cases,
head & tail.
– In throwing a die, there are 6 exhaustive cases
since any one of the 6 faces 1,2,…,6 may come
uppermost.

Experiment – Collectively Exhaustive Events:
– Tossing an unbiased coin: possible outcomes – Head/ Tail; exhaustive no. of cases – 2
– Throwing an unbiased cubic die: possible outcomes – 1, 2, 3, 4, 5, 6; exhaustive no. of cases – 6
– Drawing a card from a well-shuffled standard pack of playing cards: possible outcomes – Ace to King in each of the 4 suits; exhaustive no. of cases – 52

6
• Favorable Events/ Cases: - It is the number of
outcomes which entail the happening of an event.
– In throwing of 2 dice, the number of cases favorable to getting
the sum 5 is:
(1,4), (4,1), (2,3), (3,2).
– In drawing a card from a pack of cards the number of cases
favorable to drawing an ace is 4, for drawing a spade is 13 &
for drawing a red card is 26.

• Independent Events: - If the happening (or non-happening) of an event is not affected by the supplementary knowledge concerning the occurrence of any number of the remaining events.
– In tossing an unbiased coin the event of getting a head in the
first toss is independent of getting a head in the second, third
& subsequent throws.

7
• Mutually exclusive Events: - If the happening of any one of the events precludes the happening of all the others.
– In tossing a coin the events head & tail are mutually
exclusive.
– In throwing a die all the 6 faces numbered 1 to 6 are
mutually exclusive since if any one of these faces comes,
the possibility of others, in the same trial, is ruled out.

Experiment – Mutually Exclusive Events:
– Tossing an unbiased coin: Head/ Tail
– Throwing an unbiased cubic die: occurrence of 1 or 2 or 3 or 4 or 5 or 6
– Drawing a card from a well-shuffled standard pack of playing cards: card is a spade or heart; card is a diamond or club; card is a king or a queen

8
• Equally likely Events: - Outcomes of a trial are said to be equally likely if, taking into consideration all the relevant evidence, there is no reason to expect one in preference to the others.
– In tossing an unbiased or uniform coin, head and tail are equally likely events.
– In throwing an unbiased die, all the 6 faces are equally likely to come up.

Experiment – Equally Likely Events:
– Tossing an unbiased coin: a Head is as likely to come up as a Tail
– Throwing an unbiased cubic die: any number out of 1, 2, 3, 4, 5, 6 is equally likely to come up
– Drawing a card from a well-shuffled standard pack of playing cards: any card out of the 52 is equally likely to come up

9
• Probability: The probability of a given event is an expression of the likelihood of its occurrence.
– Probability is a number which ranges from 0 to 1.
– Zero (0) for an event which cannot occur and 1 for an
event which is certain to occur.

• Importance of the concept of Probability
– Probability models can be used for making predictions.
– Probability theory facilitates the construction of econometric models.
– It facilitates managerial decisions on planning and control.

10
Types of Probability

There are 3 approaches to probability, namely:

1. The Classical or ‘a priori’ probability


2. The Statistical or Empirical probability
3. The Axiomatic probability

11
Mathematical/ Classical/ ‘a priori’
Probability
• Basic assumption of classical approach is that the
outcomes of a random experiment are “equally
likely”.

• According to Laplace, a French mathematician: "Probability is the ratio of the number of 'favorable' cases to the total number of equally likely cases."

• If the probability of occurrence of an event E is denoted by P(E), then by this definition we have:

p = P(E) = (Number of favorable cases) / (Total number of equally likely cases) = m/n

12
• Probability ‘p’ of the happening of an event is also
known as probability of success & ‘q’ the non-
happening of the event as the probability of failure.

• If P(E) = 1, E is called a certain event, & if P(E) = 0, E is called an impossible event.

• The probability of an event E is a number such that 0 ≤ P(E) ≤ 1, & the sum of the probability that an event will occur & the probability that it will not occur is equal to 1,
i.e., p + q = 1
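As a quick illustration of the classical definition (a short Python sketch, not part of the original slides), the code below enumerates the 36 equally likely outcomes of throwing 2 dice, counts the cases favorable to getting the sum 5, and checks that p + q = 1.

from itertools import product
from fractions import Fraction

# All equally likely outcomes of throwing two dice (6 x 6 = 36 cases)
outcomes = list(product(range(1, 7), repeat=2))

# Cases favorable to "sum of the faces is 5": (1,4), (4,1), (2,3), (3,2)
favorable = [o for o in outcomes if sum(o) == 5]

p = Fraction(len(favorable), len(outcomes))   # classical probability m/n
q = 1 - p                                     # probability of failure

print(p, q, p + q)   # 1/9 8/9 1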

13
Limitations of Classical definition
• Classical probability is often called a priori probability
because if one keeps using orderly examples of
unbiased dice, fair coin, etc. one can state the
answer in advance (a priori) without rolling a dice,
tossing a coin etc.

• Classical definition of probability is not very satisfactory because of the following reasons:
– It fails when the number of possible outcomes of
the experiment is infinite.
– It is based on the cases which are “equally likely”
and as such cannot be applied to experiments
where the outcomes are not equally likely.
14
– It may not be possible practically to enumerate
all the possible outcomes of certain experiments
and in such cases the method fails.

• Example: it is inadequate for answering questions such as: What is the probability that a man aged 45 will die within the next year?

Here there are only 2 possible outcomes: the individual will die in the ensuing year or he will live. The chance that he will die is, of course, much smaller than the chance that he will live.

How much smaller?


15
Relative/ Statistical/ Empirical
Probability
• The probability of an event is determined objectively by repetitive empirical observations/ experiments. Probabilities are assigned a posteriori.

• According to Von Mises: "If an experiment is performed repeatedly under essentially homogeneous and identical conditions, then the limiting value of the ratio of the number of times the event occurs to the number of trials, as the number of trials becomes indefinitely large, is called the probability of happening of the event, it being assumed that the limit is finite and unique."
– Example: - When a coin is tossed, what is the probability that
the coin will turn heads?
• Suppose coin is tossed for 50 times & it falls head 20
times, then the ratio 20/50 is used as an estimate of the
probability of heads of this coin.

16
• Symbolically, if in N trials an event E happens m times, then the probability 'p' of the happening of E is given by

p = P(E) = lim (N → ∞) m/N

• In this case, as the number of trials increases, the observed relative frequencies of outcomes move closer to the true probabilities, and tend to them as the number of trials tends to infinity (a very large number).

• The empirical probability approaches the classical probability as the number of trials becomes indefinitely large.
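The limiting-frequency idea can be illustrated with a small simulation (a sketch only; the head probability of 0.5 for a fair coin is an assumption): as the number of trials N grows, the relative frequency m/N settles near the true probability.

import random

random.seed(42)
p_true = 0.5   # assumed probability of heads for a fair coin

m, N = 0, 0
for checkpoint in (10, 100, 1_000, 10_000, 100_000):
    while N < checkpoint:
        m += random.random() < p_true   # one toss; adds 1 when a head occurs
        N += 1
    print(f"N = {N:>6}, relative frequency m/N = {m / N:.4f}")
# The printed ratios approach p_true = 0.5 as N becomes large.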

17
Limitations of Statistical/ Empirical
method
• The Empirical probability P(A) defined earlier
can never be obtained in practice and we can
only attempt at a close estimate of P(A) by
making N sufficiently large.
• The experimental conditions may not remain
essentially homogeneous and identical in a
large number of repetitions of the experiment.
• The relative frequency m/N may not attain a unique value, however large N may be.

18
The Axiomatic Approach
• The modern theory of probability is based on the axiomatic approach introduced by the Russian mathematician A. N. Kolmogorov in the 1930s.

• The classical approach restricts the calculation of probability to essentially equally likely & mutually exclusive events.

• Empirical approach requires that every question be examined experimentally under identical conditions, over a long period of time considering repeated observations.

• Axiomatic approach is largely free from the inadequacies of both the classical & empirical approaches.

19
• Given a sample space of a random experiment, the
probability of the occurrence of any event A is
defined as a set function P(A) satisfying the
following axioms.
1. Axiom 1: - P(A) is defined, is real and non-
negative i.e.,
P(A) ≥ 0 (Axiom of non-negativity)
2. Axiom 2: - P(S) = 1 (Axiom of certainty)
3. Axiom 3: - If A1, A2, …., An is any finite or infinite
sequence of disjoint events of S, then

P(A1 U A2 U … U An) = ∑ P(Ai), the sum taken over i = 1, 2, …, n  (Axiom of additivity)
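A minimal numerical check of the three axioms for a simple discrete sample space (an unbiased die, assumed here purely for illustration) might look like this sketch:

from fractions import Fraction

# Sample space S for an unbiased die, with P assigning 1/6 to each face
S = {1, 2, 3, 4, 5, 6}
P_point = {face: Fraction(1, 6) for face in S}

def P(event):
    """Probability of an event (a subset of S) as the sum of point masses."""
    return sum(P_point[x] for x in event)

A1, A2 = {1, 2}, {5}                  # two disjoint events
assert P(A1) >= 0 and P(A2) >= 0      # Axiom 1: non-negativity
assert P(S) == 1                      # Axiom 2: certainty
assert P(A1 | A2) == P(A1) + P(A2)    # Axiom 3: additivity for disjoint events
print(P(A1), P(A2), P(A1 | A2))       # 1/3 1/6 1/2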

20
The Objective and Subjective Approach

• The objective approach to probability assigns probabilities on an empirical basis rather than on the basis of personal opinion.
– It is given by the ratio of the frequency of an outcome to the total number of possible outcomes.

• The subjective approach to probability is not concerned with the relative or expected frequency of an outcome.
– It is concerned with the strength of a decision maker's belief that an outcome will occur.
– It is particularly oriented towards decision-making
situations.
21
Theorems of Probability
• There are 2 important theorems of
probability which are as follows:

– The Addition Theorem and


– The Multiplication Theorem

22
Addition theorem when events are
Mutually Exclusive
• Definition: - It states that if 2 events A and B are mutually exclusive, then the probability of the occurrence of either A or B is the sum of the individual probabilities of A and B.
• Symbolically

P(A or B) or P(A U B) = P(A) + P(B)

• The theorem can be extended to three or more mutually exclusive events. Thus,
P(A or B or C) = P(A) + P(B) + P(C)
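A short sketch of the rule (Python, assuming an unbiased die, so this example is illustrative rather than taken from the slides): the faces are mutually exclusive events, so their probabilities simply add.

from fractions import Fraction

# Unbiased die: each face is a mutually exclusive event with probability 1/6
P = {face: Fraction(1, 6) for face in range(1, 7)}

P_A, P_B, P_C = P[1], P[2], P[3]   # events: getting a 1, a 2, a 3

# P(A or B or C) = P(A) + P(B) + P(C) for mutually exclusive events
print(P_A + P_B + P_C)   # 1/2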
23
Addition theorem when events are not
Mutually Exclusive (Overlapping or
Intersection Events)
• Definition: - It states that if 2 events A and B are not mutually exclusive, then the probability of the occurrence of either A or B is the sum of the individual probabilities of A and B minus the probability of the occurrence of both A and B.
• Symbolically

P(A or B) or P(A U B) = P(A) + P(B) – P(A ∩ B)
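For the overlapping case, a quick sketch (assuming a standard 52-card pack; the event chosen here, "the card is a king or a heart", is an illustration, not from the slides):

from fractions import Fraction

P_king  = Fraction(4, 52)    # 4 kings in the pack
P_heart = Fraction(13, 52)   # 13 hearts in the pack
P_both  = Fraction(1, 52)    # the king of hearts belongs to both events

# P(A U B) = P(A) + P(B) - P(A ∩ B) when the events can occur together
print(P_king + P_heart - P_both)   # 4/13  (i.e. 16/52)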

24
Multiplication theorem
• Definition: - It states that if 2 events A and B are independent, then the probability of the occurrence of both of them (A & B) is the product of the individual probabilities of A and B.
• Symbolically,
Probability of happening of both the events:
P(A and B) or P(A ∩ B) = P(A) x P(B)

• Theorem can be extended to 3 or more independent events. Thus,
P(A, B and C) or P(A ∩ B ∩ C) = P(A) x P(B) x P(C)
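A minimal sketch of the multiplication theorem for independent events (the 0.5 head probability of a fair coin is an assumption for illustration):

# Three independent tosses of a fair coin
P_H = 0.5

# P(A ∩ B ∩ C) = P(A) * P(B) * P(C) for independent events
print(P_H * P_H * P_H)   # 0.125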

25
Probability Rules

26
Probabilities under conditions of
Statistical Independence
• Statistically Independent Events: - The
occurrence of one event has no effect on the
probability of the occurrence of any other event.
• Most managers who use probabilities are
concerned with 2 conditions.
1. The case where one event or another will occur.
2. The situation where 2 or more events will both occur.

27
• There are 3 types of probabilities under statistical
independence.
– Marginal
– Joint
– Conditional

• Marginal/ Unconditional Probability: - A single probability where only one event can take place.

• Joint Probability: - Probability of 2 or more events occurring together or in succession.

• Conditional Probability: - Probability that a second event (B) will occur if a first event (A) has already happened.
28
Example: Marginal Probability - Statistical
Independence
• A single probability where only one event can take place.

Marginal Probability of an Event


P(A) = P(A)

• Example 1: - On each individual toss of a biased or unfair coin, P(H) = 0.90 & P(T) = 0.10. The outcomes of several tosses of this coin are statistically independent events, even though the coin is biased.

• Example 2: - 50 students of a school drew a lottery to see which student would get a free trip to the Carnival at Goa. Any one of the students can calculate his/her chance of winning as:
P(Winning) = 1/50 = 0.02

29
Example: Joint Probability - Statistical
Independence
• The probability of 2 or more independent events occurring
together or in succession is the product of their marginal
probabilities.

Joint Probability of 2 Independent Events


P(AB) = P(A) * P(B)

• Example: - What is the probability of heads on 2 successive tosses of a fair coin?
P(H1H2) = P(H1) * P(H2) = 0.5 * 0.5 = 0.25
The probability of heads on 2 successive tosses is 0.25, since the probability of any outcome is not affected by any preceding outcome.
30
• We can make the probabilities of events even more explicit using a Probabilistic Tree.

1st Toss: H1 (0.5), T1 (0.5)
2nd Toss: H1H2 (0.25), H1T2 (0.25), T1H2 (0.25), T1T2 (0.25)
3rd Toss: H1H2H3 (0.125), H1H2T3 (0.125), H1T2H3 (0.125), H1T2T3 (0.125), T1H2H3 (0.125), T1H2T3 (0.125), T1T2H3 (0.125), T1T2T3 (0.125)
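The tree above can be reproduced by enumerating all sequences of tosses and multiplying the probabilities along each branch (a sketch assuming a fair coin):

from itertools import product

P = {"H": 0.5, "T": 0.5}   # fair coin assumed

for n_tosses in (1, 2, 3):
    print(f"--- {n_tosses} toss(es) ---")
    for branch in product("HT", repeat=n_tosses):
        prob = 1.0
        for outcome in branch:
            prob *= P[outcome]          # multiply probabilities along the branch
        print("".join(branch), prob)    # e.g. HHT 0.125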

31
Example: Conditional Probability - Statistical
Independence

• For statistically independent events, conditional probability of event B given that event A has occurred is simply the probability of event B.

Conditional Probability for 2 Independent Events


P(B|A) = P(B)

• Example: - What is the probability that the second toss of a fair coin will result in heads, given that heads resulted on the first toss?
P(H2|H1) = 0.5
For 2 independent events, the result of the first toss has absolutely no effect on the result of the second toss.

32
Probabilities under conditions of
Statistical Dependence
• Statistical Dependence exists when the
probability of some event is dependent on or
affected by the occurrence of some other
event.

• The types of probabilities under statistical dependence are:
• Marginal
• Joint
• Conditional
33
Example
• Assume that a box contains 10 balls distributed as follows: -
– 3 are colored & dotted
– 1 is colored & striped
– 2 are gray & dotted
– 4 are gray & striped

Each of the 10 balls (numbered 1 to 10) has probability 0.1 of being drawn:
– Balls 1, 2, 3 (probability 0.1 each): Colored & Dotted
– Ball 4 (probability 0.1): Colored & Striped
– Balls 5, 6 (probability 0.1 each): Gray & Dotted
– Balls 7, 8, 9, 10 (probability 0.1 each): Gray & Striped

34
Example: Marginal Probability - Statistically
Dependent
• It can be computed by summing up the probabilities of all the joint events in which the simple event occurs.

• Compute the marginal probability of the event colored. It can be computed by summing up the probabilities of the two joint events in which colored occurred:

P(C) = P(CD) + P(CS) = 0.3 + 0.1 = 0.4

35
Example: Joint Probability - Statistically Dependent

• Joint probabilities under conditions of statistical dependence are given by:
Joint probability for Statistically Dependent Events
P(BA) = P(B|A) * P(A)

• What is the probability that a ball drawn from this box is dotted and colored?
Probability of colored & dotted balls:
P(DC) = P(D|C) * P(C) = (0.3/0.4) * 0.4 = 0.3

36
Example: Conditional Probability - Statistically
Dependent
• Given A & B to be the 2 events, then:
Conditional probability for Statistically Dependent Events
P(B|A) = P(BA) / P(A)

• What is the probability that this ball is dotted, given that it is colored?

The probability of drawing any one of the balls from this box is 0.1 (1/10) [Total no. of balls in the box = 10].

37
We know that there are 4 colored balls, 3 of which are dotted & one of which is striped.

P(D|C) = P(DC) / P(C) = 0.3 / 0.4 = 0.75

P(DC) = probability of colored & dotted balls (3 out of 10 --- 3/10)
P(C) = probability of colored balls (4 out of 10 --- 4/10)
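The three kinds of probability for the box of 10 balls can be reproduced directly from the counts (a Python sketch; the ball labels are those used on the earlier slide):

from fractions import Fraction

# The box of 10 balls from the earlier slide
balls = (["colored dotted"] * 3 + ["colored striped"] * 1 +
         ["gray dotted"] * 2 + ["gray striped"] * 4)
n = len(balls)   # 10

P_C  = Fraction(sum("colored" in b for b in balls), n)         # marginal: P(C) = 4/10
P_DC = Fraction(sum(b == "colored dotted" for b in balls), n)  # joint:    P(DC) = 3/10
P_D_given_C = P_DC / P_C                                       # conditional: P(D|C) = 0.75

print(P_C, P_DC, P_D_given_C)   # 2/5 3/10 3/4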

38
Revising Prior Estimates of
Probabilities: Bayes’ Theorem

• A very important & useful application of conditional probability is the computation of unknown probabilities, based on past data or information.

• When an event occurs through one of various mutually disjoint events, the conditional probability that this event has occurred due to a particular reason or event is termed the Inverse Probability or Posterior Probability.

• It has wide-ranging applications in Business & its Management.

39
• Since it is a concept of revising a probability on the basis of additional information, it shows how the additional information moves the assessment closer to certainty about the event.

• Example 1: - If a manager of a boutique finds that most of the purple & white jackets that she thought would sell so well are hanging on the rack, she must revise her prior probabilities & order a different color combination or have a sale.

• Certain probabilities were altered after the people got additional information. New probabilities are known as revised, or Posterior, probabilities.

40
Bayes Theorem
• If an event A can occur only in conjunction with n mutually exclusive & exhaustive events B1, B2, …, Bn, and if A actually happens, then, provided the conditional probabilities P(A|B1), P(A|B2), …, P(A|Bn) and the marginal probabilities P(Bi) are known, the posterior probability of event Bi given that event A has occurred is given by:

P(Bi | A) = P(A | Bi) P(Bi) / ∑ P(A | Bj) P(Bj),  the sum taken over j = 1, 2, …, n
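A compact sketch of the formula (the helper name bayes_posteriors is hypothetical, not from the slides): given the prior probabilities P(Bi) and the likelihoods P(A|Bi), it returns the posterior probabilities P(Bi|A).

def bayes_posteriors(priors, likelihoods):
    """Return P(Bi|A) for each i, given P(Bi) and P(A|Bi).

    priors and likelihoods are parallel lists over the mutually
    exclusive & exhaustive events B1, ..., Bn.
    """
    joint = [p * l for p, l in zip(priors, likelihoods)]   # P(A|Bi) * P(Bi)
    total = sum(joint)                                     # ∑ P(A|Bj) * P(Bj) = P(A)
    return [j / total for j in joint]

# Example: two equally likely causes with different chances of producing event A
print(bayes_posteriors([0.5, 0.5], [0.8, 0.2]))   # [0.8, 0.2]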

41
Remarks: -

– The probabilities P(B1), P(B2), …, P(Bn) are termed as the 'a priori probabilities' because they exist before we gain any information from the experiment itself.

– The probabilities P(A | Bi), i = 1, 2, …, n are called 'Likelihoods' because they indicate how likely the event A under consideration is to occur, given each & every a priori probability.

– The probabilities P(Bi | A), i = 1, 2, …, n are called 'Posterior probabilities' because they are determined after the results of the experiment are known.

42
Problem
• In a bolt factory, machines A, B, & C manufacture respectively 25%, 35%, & 40% of the total output. Of their output, 5%, 4%, & 2% respectively are defective bolts. A bolt is drawn at random from the product & is found to be defective.

What are the probabilities that it was manufactured by machines A, B & C?

43
Solution
• Let E1, E2, E3 denote the events manufactured
by machines A, B & C respectively.
• Let E denote the event of its being defective.
P(E1) = 0.25; P(E2) = 0.35; P(E3) = 0.40;
Probability of drawing a defective bolt
manufactured by machine A is P(E|E1) = 0.05
Similarly P(E|E2) = 0.04; P(E|E3) = 0.02
The probability that a defective bolt selected at random was manufactured by machine A is given by

44
P(E1|E) = P(E1)*P(E|E1) / ∑ P(Ei)*P(E|Ei),  the sum taken over i = 1 to 3

        = 0.25*0.05 / (0.25*0.05 + 0.35*0.04 + 0.40*0.02) = 25/69

Similarly,
P(E2|E) = 0.35*0.04 / (0.25*0.05 + 0.35*0.04 + 0.40*0.02) = 28/69
P(E3|E) = 0.40*0.02 / (0.25*0.05 + 0.35*0.04 + 0.40*0.02) = 16/69
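The same computation can be checked numerically with a small sketch (exact fractions are used so the answers 25/69, 28/69, 16/69 come out exactly):

from fractions import Fraction

priors      = [Fraction(25, 100), Fraction(35, 100), Fraction(40, 100)]  # P(E1), P(E2), P(E3)
likelihoods = [Fraction(5, 100),  Fraction(4, 100),  Fraction(2, 100)]   # P(E|E1), P(E|E2), P(E|E3)

joint = [p * l for p, l in zip(priors, likelihoods)]   # P(Ei) * P(E|Ei)
total = sum(joint)                                     # overall probability of a defective bolt

posteriors = [j / total for j in joint]
print(posteriors)   # [Fraction(25, 69), Fraction(28, 69), Fraction(16, 69)]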

45
Glossary of terms
• Classical Probability: It is based on the idea that certain
occurrences are equally likely.
– Example: - Numbers 1, 2, 3, 4, 5, & 6 on a fair die are
each equally likely to occur.
• Conditional Probability: The probability that an event
occurs given the outcome of some other event.
• Independent Events: Events are independent if the
occurrence of one event does not affect the occurrence of
another event.
• Joint Probability: Is the likelihood that 2 or more events will
happen at the same time.
• Multiplication Formula: If there are m ways of doing one
thing and n ways of doing another thing, there are m x n
ways of doing both.

46
• Mutually exclusive events: A property of a set of categories
such that an individual, object, or measurement is included in
only one category.
• Objective Probability: It is based on symmetry of games of
chance or similar situations.
• Outcome: Observation or measurement of an experiment.
• Posterior Probability: A revised probability based on
additional information.
• Prior Probability: The initial probability based on the present
level of information.
• Probability: A value between 0 and 1, inclusive, describing the relative possibility (chance or likelihood) that an event will occur.
• Subjective Probability: Synonym for personal probability.
Involves personal judgment, information, intuition, & other
subjective evaluation criteria.
– Example: - A physician assessing the probability of a
patient’s recovery is making a personal judgment based on
what they know and feel about the situation.

47
