
Probability and Random Variables
Abu Bakr Siddique
Lecture Outline
• Binomial Probability Law
• Geometric Probability Law
• Random Variables
• Probability Mass Function
• Expected Value
Binomial Probability Law
Sequential Experiments
Sequential Experiments
1) Independent Experiments
- Binomial Probability law
- Multinomial Probability law
- Geometric Probability law
2) Dependent Experiments
- Markov Chains
Bernoulli Trials
A Bernoulli trial (or binomial trial) is a random experiment with exactly two possible outcomes, "success" and "failure", in which the probability of success is the same every time the experiment is conducted and the trials are independent of one another, e.g., a coin toss.
Bernoulli Trials
1) Independent trials
2) Only 2 possible outcomes (success and failure)
3) Probability of success remains the same throughout the trials
Binomial Probability Law
If we have n independent Bernoulli trials, each with probability of success p, then the probability of k successes is given by the binomial probability law:

P[k successes in n trials] = nCk × p^k × (1 − p)^(n − k),  k = 0, 1, …, n

where nCk = n! / (k!(n − k)!) is the binomial coefficient.

Note that if success happens k times, failure will occur (n − k) times.
Example
• A coin is tossed 3 times. Suppose the probability of heads is 1/3. What is the probability of getting 1 head in the 3 tosses?
n is the number of trials, n = 3.
k is the number of successes we are interested in, k = 1.
Let heads denote success and tails denote failure, so P[success] = 1/3 and P[failure] = 2/3.

P[1 head] = 3C1 × (1/3)^1 × (2/3)^2 = 3 × (1/3) × (4/9) = 12/27 ≈ 0.44

• Recall that we previously solved the same problem using a tree diagram, in the lecture on the theorem of total probability:
P[1 head] = (1/3 × 2/3 × 2/3) + (2/3 × 1/3 × 2/3) + (2/3 × 2/3 × 1/3) = 12/27 ≈ 0.44
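As an illustrative check (the slides contain no code; this Python sketch is ours), both the binomial formula and the tree-diagram sum can be evaluated directly:

    from math import comb

    n, k, p = 3, 1, 1/3  # 3 tosses, 1 head, P[heads] = 1/3

    # Binomial probability law: nCk * p^k * (1 - p)^(n - k)
    binomial = comb(n, k) * p**k * (1 - p)**(n - k)

    # Tree diagram: three equally likely orderings of 1 head and 2 tails
    tree = 3 * (1/3) * (2/3) * (2/3)

    print(binomial, tree)  # both ~0.444 (= 12/27)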
Binomial Probability Law
• The binomial probability law makes it easier to find the probability in more complicated cases (e.g., obtaining 2 heads in 5 tosses of a fair coin).
Binomial Probability Law

The binomial coefficient nCk gives the number of distinct ways in which k successes can occur in n trials.

For example, in the previous example of obtaining 1 head in 3 tosses, the binomial coefficient was 3C1 = 3. The same result can be found from the tree diagram (see next slide).
Binomial Probability Law
• The binomial probability law can only be used for the case of Bernoulli trials. In such a case the formula arises naturally from the tree diagram.

• Consider the situation of obtaining 20 heads in 50 tosses (Bernoulli trials) using a biased coin with probability of heads 1/3. If we could visualize the tree diagram of this scenario, we would notice that in all 50 layers of the tree diagram the probability of the branches corresponding to the outcome "heads" remains 1/3 throughout.
Binomial Probability Law
• Similarly, the probability of the other branch (corresponding to the outcome "tails") remains 2/3 throughout the tree diagram.

• Therefore, in this tree diagram there are only two possible outcomes, success (heads) and failure (tails), and the probability of each remains unchanged from trial to trial.

• The binomial coefficient gives the number of possible ways that 20 successes (heads) can occur in the 50 trials (tosses).
Binomial Probability Law
• Each of these possible outcomes has exactly the same probability. This is because for each of these outcomes you pass through the heads branches 20 times and through the tails branches 30 times.

• In other words, the probability of each such outcome is the probability of heads to the power 20 multiplied by the probability of tails to the power 30: (1/3)^20 × (2/3)^30.
Binomial Coefficient
• Also note that Bernoulli trials are like partitioning a set of n distinct objects into two sets:
– a set B containing the k objects that are selected
– and a set Bc containing the n − k objects that were left behind when choosing the k objects.
Binomial Coefficient
• If success happens k times, failure will occur (n − k) times.

• So getting 20 heads in 50 tosses is the same as getting 30 (= 50 − 20) tails in the 50 tosses.
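A one-line Python check of this symmetry (our illustration, not from the slides):

    from math import comb

    # Choosing 20 heads out of 50 tosses is the same count as
    # choosing the 30 tails that are left behind
    print(comb(50, 20) == comb(50, 30))  # True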
Example
• Q1) The probability of error of a certain
communication channel is 0.001. The transmitter
transmits each information bit three times. At the
receiver, a decoder takes a majority vote of the
received bits to decide on what the transmitted bit
was. Find the probability that the receiver will
make an incorrect decision.
Example
• The receiver makes an incorrect decision when none or only one of the 3 bit transmissions is correct.
• Let the occurrence of a correct bit on a specific trial be a success, so P[success] = 1 − 0.001 = 0.999.
• n = 3 (number of trials/transmissions).
• k = 0 and k = 1 (the numbers of successes, i.e., correct bits, for which we want the probability).

P[incorrect decision] = 3C0 × (0.999)^0 × (0.001)^3 + 3C1 × (0.999)^1 × (0.001)^2 ≈ 3 × 10^−6
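A minimal Python sketch of this computation (our illustration; the numeric answer is evaluated from the formula above, not taken from the slides):

    from math import comb

    p_correct = 1 - 0.001  # P[a single bit transmission is correct]

    # Receiver errs when 0 or 1 of the 3 transmissions are correct
    p_error = sum(comb(3, k) * p_correct**k * (1 - p_correct)**(3 - k)
                  for k in (0, 1))
    print(p_error)  # ~3.0e-06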
Example
• Q2) A block of 100 bits is transmitted over a binary communication channel with probability of bit error p = 0.01.

• (a) If the block has 1 or fewer errors, then the receiver accepts the block. Find the probability that the block is accepted.

• (b) If the block has more than 1 error, then the block is retransmitted. Find the probability that M retransmissions are required.
Example
The block is accepted if 99 or all 100 bits are correct. Let success refer to having a correct bit in the block, so the probability of success is 1 − p = 0.99 and the probability of failure is p = 0.01. Using the binomial probability law:

P[block accepted] = 100C100 × (0.99)^100 + 100C99 × (0.99)^99 × (0.01)^1 ≈ 0.7357
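The same computation in Python (an illustrative sketch, not part of the slides):

    from math import comb

    p_correct, p = 0.99, 0.01

    # Block accepted if 99 or all 100 of the bits are correct
    p_accept = (comb(100, 100) * p_correct**100 +
                comb(100, 99) * p_correct**99 * p)
    print(p_accept)  # ~0.7357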
Geometric Probability Law
Geometric Probability Law
A coin is tossed until the first heads (success) appears. That is, we perform n = 1 to ∞ Bernoulli trials. What is the probability that the first success (heads) occurs on the m-th trial?
Geometric Probability Law
If p is the probability of success in each trial, then

P[first success on trial m] = (1 − p)^(m−1) × p,  m = 1, 2, 3, …

since the first m − 1 trials must be failures and the m-th trial a success.
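A small Python sketch of the geometric law (ours, for illustration), checking that the probabilities over all trials sum to 1:

    def geometric_pmf(m, p):
        """P[first success occurs on trial m] = (1 - p)**(m - 1) * p."""
        return (1 - p)**(m - 1) * p

    p = 0.5  # fair coin
    print(geometric_pmf(1, p), geometric_pmf(3, p))         # 0.5 0.125
    print(sum(geometric_pmf(m, p) for m in range(1, 200)))  # ~1.0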
Example
• Q2) A block of 100 bits is transmitted over a binary communication channel with probability of bit error p = 0.01.

• (a) If the block has 1 or fewer errors, then the receiver accepts the block. Find the probability that the block is accepted.

• (b) If the block has more than 1 error, then the block is retransmitted. Find the probability that M retransmissions are required.
Example
We will have M retransmissions of the block if the block is rejected in the first M trials and then accepted in trial number M + 1.

If an accepted block is considered a success and a rejected block a failure, then the probability of failure in the first M trials and success in trial number M + 1 is given by the geometric probability law.
Example

P[M retransmissions] = (1 − p)^M × p = (0.2643)^M × 0.7357

where the probability of success p denotes the probability that the block is accepted, which is 0.7357 (see part a).
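In Python (illustrative sketch; p_accept is taken from part (a)):

    p_accept = 0.7357  # P[block accepted], from part (a)

    def p_retransmissions(M):
        """P[M retransmissions]: block rejected M times, then accepted."""
        return (1 - p_accept)**M * p_accept

    print(p_retransmissions(0))  # 0.7357 (accepted on the first try)
    print(p_retransmissions(2))  # ~0.051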
Random Variables
Function
A function is a mapping from the members of one set to the members of another set.

For example, the function f(x) = x² maps a member of the set X to a member of the set Y.
Function
The set from which an element is mapped is called the Domain of the function.

The set onto which an element is mapped is called the Range of the function.

In this case the Domain of the function f(x) is the set X and the Range of the function f(x) is the set Y.
Random Variable
In probability theory, a random variable is a function which maps elements from one set (the sample space) to another set.

The random variable X(s) maps outcomes of the sample space, S, to the set SX.

SX = {x0, x1, …, xn} gives the range of values that the random variable X can take.
Random Variable - Example
Let the random variable, X, be defined as the number of heads in 3 tosses of a fair coin. Show the mapping from S to SX.
Random Variable - Example
The sample space is S = {TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}. X maps TTT to 0; TTH, THT, HTT to 1; THH, HTH, HHT to 2; and HHH to 3. Hence SX = {0, 1, 2, 3}.
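The same mapping can be written out in a few lines of Python (our illustration):

    from itertools import product

    # Sample space of 3 coin tosses
    S = [''.join(t) for t in product('HT', repeat=3)]

    # The random variable X maps each outcome to its number of heads
    X = {s: s.count('H') for s in S}
    print(X)                        # {'HHH': 3, 'HHT': 2, ..., 'TTT': 0}
    print(sorted(set(X.values())))  # S_X = [0, 1, 2, 3]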
Probability Mass Function
Probability Mass Function (pmf)
The pmf of the random variable X is the probability that the random variable takes on the value xj:

pX(xj) = P[X = xj]

For example, if the random variable X is defined as the number of heads in the 3 tosses, then the pmf of X is given by the probabilities that the random variable takes on the values 0, 1, 2, 3.
Probability Mass Function (pmf)
pX(0) = P[{TTT}] = 1/8
pX(1) = P[{TTH, THT, HTT}] = 3/8
pX(2) = P[{THH, HTH, HHT}] = 3/8
pX(3) = P[{HHH}] = 1/8
Curly brackets denote the set of outcomes (the event) that maps to each value.
Probability Mass Function (pmf)
The pmf values are defined as probabilities. Therefore, pmf values must be non-negative:

pX(xj) ≥ 0 for all xj

Also, the pmf values sum to 1:

Σj pX(xj) = 1
Probability Mass Function (pmf)
We can check this last equation by adding the pmf values of the 3 coin tosses example: 1/8 + 3/8 + 3/8 + 1/8 = 1.
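The same check in Python, using exact fractions (our illustration):

    from fractions import Fraction

    # pmf of X = number of heads in 3 tosses of a fair coin
    pmf = {0: Fraction(1, 8), 1: Fraction(3, 8),
           2: Fraction(3, 8), 3: Fraction(1, 8)}

    assert all(p >= 0 for p in pmf.values())  # non-negativity
    print(sum(pmf.values()))                  # 1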
Graph of pmf
Usually when we refer to the pmf, we are referring to a graph of the pmf.

The graph of the pmf is given by plotting the probabilities pX(x) versus x (the values that the random variable can take).

Draw the pmf of the random variable X, where X is defined as the number of heads in 3 tosses.
Graph of pmf
The graph shows stems of height 1/8, 3/8, 3/8 and 1/8 at x = 0, 1, 2 and 3 respectively.
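One way to draw this graph (an illustrative sketch using matplotlib, not part of the slides):

    import matplotlib.pyplot as plt

    x = [0, 1, 2, 3]          # values X can take
    p = [1/8, 3/8, 3/8, 1/8]  # pmf values

    plt.stem(x, p)
    plt.xlabel('x (number of heads in 3 tosses)')
    plt.ylabel('pX(x)')
    plt.title('pmf of X')
    plt.show()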
Expected Value
Probabilities as Masses
We discussed earlier that probabilities of events can be considered as areas on a Venn diagram.

A similar concept is to think of probabilities as masses.

This particular concept is useful when we look at the pmf of a random variable.
Center of Probability Mass
We say that the pmf in the previous example is centered around the point x = 1.5, since there are equal probability masses to the right and left of this point.
Expected Value of X
The center of probability mass in the pmf graph is called the expected value (or the mean) of the random variable X:

E[X] = Σj xj pX(xj)

You can think of the expected value (or the mean) as the average value that the random variable takes.
Expected Value of X
Using the expression for E[X], we can find the expected value for the 3 coin tosses example. The answer should come out to be 1.5, since that is what the pmf graph suggests.
Expected Value of X
E[X] = (0 × 1/8) + (1 × 3/8) + (2 × 3/8) + (3 × 1/8) = 12/8 = 1.5
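The same computation in Python (our illustration):

    pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

    # E[X] = sum over x of x * pX(x)
    E_X = sum(x * p for x, p in pmf.items())
    print(E_X)  # 1.5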
Properties of E[X]
If α and β are constants,
(a) E[αX] = αE[X]
(b) E[α] = α and E[β] = β
(c) E[αX + β] = E[αX] + E[β] = αE[X] + β
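These properties can be verified numerically for the 3-tosses pmf (illustrative Python, with arbitrarily chosen α = 2 and β = 3):

    pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

    def expect(g):
        """E[g(X)] computed from the pmf of X."""
        return sum(g(x) * p for x, p in pmf.items())

    alpha, beta = 2, 3
    print(expect(lambda x: alpha * x + beta))  # 6.0
    print(alpha * expect(lambda x: x) + beta)  # 6.0, property (c)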


Function of a random variable
A function of a random variable is also a random variable.

For example, if the random variable X denotes the number of heads in 3 tosses and the random variable Y is defined as follows,

Y = X² (if X = 2 or X = 3)
Y = 0 (otherwise)

then Y is also a random variable.
Expected Value of the Function of a random variable
There are two ways to find E[Y].

1st Method

The standard method uses the definition of E[·]. This method requires the probabilities of the random variable Y.
Expected Value of the Function of a random variable
The pmf of Y is pY(0) = P[X = 0] + P[X = 1] = 4/8, pY(4) = P[X = 2] = 3/8, and pY(9) = P[X = 3] = 1/8. Then:

E[Y] = (0 × 4/8) + (4 × 3/8) + (9 × 1/8)
E[Y] = 21/8
Expected Value of the Function of a random variable
2nd Method

We find E[Y] directly using the probabilities of the random variable X:

E[Y] = (0 × 1/8) + (0 × 3/8) + (2² × 3/8) + (3² × 1/8) = 21/8
Expected Value of Y

E[Y] = (0 × 1/8) + (0 × 3/8) + (2² × 3/8) + (3² × 1/8) = 21/8
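Both methods in Python (our illustrative sketch, using exact fractions):

    from fractions import Fraction

    pmf_X = {0: Fraction(1, 8), 1: Fraction(3, 8),
             2: Fraction(3, 8), 3: Fraction(1, 8)}

    def g(x):
        """Y = g(X): X squared if X is 2 or 3, else 0."""
        return x**2 if x in (2, 3) else 0

    # 1st method: build the pmf of Y, then apply the definition of E[Y]
    pmf_Y = {}
    for x, p in pmf_X.items():
        pmf_Y[g(x)] = pmf_Y.get(g(x), 0) + p
    E_Y_1 = sum(y * p for y, p in pmf_Y.items())

    # 2nd method: E[Y] = sum over x of g(x) * pX(x), no pmf of Y needed
    E_Y_2 = sum(g(x) * p for x, p in pmf_X.items())

    print(E_Y_1, E_Y_2)  # 21/8 21/8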
