Apartment complex data
Apartment number | Floor number | No. of bedrooms | Size of apartment (sq. ft) | Distance of apartment from lift (meters)
1 | 1 | 1 | 900.23 | 503.5
2 | 1 | 2 | 1175.34 | 325.6
3 | 1 | 3 | 1785.85 | 450.8
4 | 2 | 1 | 900.48 | 500.1
5 | 2 | 2 | 1175.23 | 324.5
6 | 2 | 3 | 1785.35 | 456.7
7 | 3 | 1 | 900.53 | 502.5
8 | 3 | 2 | 1176.34 | 325.6
9 | 3 | 3 | 1787.85 | 450.8
10 | 4 | 1 | 900.78 | 500.1
11 | 4 | 2 | 1176.03 | 325.4
12 | 4 | 3 | 1784.85 | 455.7
Questions
1. Let the random variable be the number of bedrooms. What are the possible values that might be observed?
Answer: 1, 2, 3
2. Let the random variable be the floor number of the apartment. What are the possible values that might be observed?
Answer: 1, 2, 3, 4
3. Let the random variable be the size of the apartment. What are the possible values that might be observed?
Answer: values in the interval [900.23, 1787.85] sq. ft
4. Let the random variable be the distance of the apartment from the lift. What are the possible values that might be observed?
Answer: values in the interval [324.5, 503.5] meters
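A quick way to verify these answers is to compute the distinct values and observed ranges directly from the table. A minimal sketch in Python (the list names are just a transcription of the table above):

```python
# Transcription of the apartment table above (hypothetical variable names).
bedrooms  = [1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3]
floors    = [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4]
sizes     = [900.23, 1175.34, 1785.85, 900.48, 1175.23, 1785.35,
             900.53, 1176.34, 1787.85, 900.78, 1176.03, 1784.85]
distances = [503.5, 325.6, 450.8, 500.1, 324.5, 456.7,
             502.5, 325.6, 450.8, 500.1, 325.4, 455.7]

# Discrete random variables: list the distinct observed values.
print(sorted(set(bedrooms)))   # [1, 2, 3]
print(sorted(set(floors)))     # [1, 2, 3, 4]

# Continuous random variables: report the observed range.
print(min(sizes), max(sizes))            # 900.23 1787.85
print(min(distances), max(distances))    # 324.5 503.5
```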
Probability Mass Functions (p.m.f.)
Suppose a random variable X takes n values x1, x2, ..., xn. There are two conditions to satisfy a probability mass function:
1) P(X = xi) ≥ 0, i.e., it cannot be negative
2) The probabilities of all n values should add up to 1

Properties of p.m.f
> The probability mass function p(x) is positive for at most a countable number of values of x. That is, if X must assume one of the values x1, x2, ..., then
1. p(xi) ≥ 0, i = 1, 2, ...
2. p(x) = 0 for all other values of x
> Represent it in tabular form
x | x1 | x2 | x3
P(X = x) | p(x1) | p(x2) | p(x3)
> Since X must take one of the values xi, we have
∑ p(xi) = 1
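These two conditions are easy to check mechanically. A minimal sketch in Python (the function name `is_valid_pmf` is just an illustration):

```python
import math

def is_valid_pmf(probs):
    """Check the two p.m.f. conditions on a list of probabilities."""
    nonnegative = all(p >= 0 for p in probs)       # condition 1: p(xi) >= 0
    sums_to_one = math.isclose(sum(probs), 1.0)    # condition 2: probabilities sum to 1
    return nonnegative and sums_to_one

print(is_valid_pmf([0.25, 0.5, 0.25]))   # True
print(is_valid_pmf([0.5, 0.6, -0.1]))    # False (negative probability)
```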
Example 1 (a discrete random variable taking a finite set of values):
> Suppose X is a random variable that takes three values, 0, 1, and 2, with probabilities
> p(0) = P(X = 0) = 1/4
> p(1) = P(X = 1) = 1/2
> p(2) = P(X = 2) = 1/4
> Tabular form:

x | 0 | 1 | 2
P(X = x) | 1/4 | 1/2 | 1/4
Example 2 (a discrete random variable taking an infinite set of values):
> Suppose X is a random variable that takes values 0, 1, 2, ... with probabilities
> p(i) = c λ^i / i!, for some positive λ
> What is the value of c?
> Since ∑ p(i) = 1, we need c ∑ λ^i / i! = c e^λ = 1, and hence c = e^(−λ)
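A quick numeric sanity check of c = e^(−λ) (a sketch; λ = 2 is an arbitrary choice):

```python
import math

lam = 2.0
c = math.exp(-lam)   # the value of c derived above
total = sum(c * lam**i / math.factorial(i) for i in range(100))
print(total)         # ~1.0, confirming the p.m.f. sums to 1
```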
Graphing the p.m.f.
Example: positively skewed distribution

x | 1 | 2 | 3 | 4
P(X = x) | 0.4 | 0.3 | 0.2 | 0.1

[Bar chart of the p.m.f., with x on the horizontal axis and probability on the vertical axis]
Note: You can have uniform, skewed, and symmetric graphs. You can also have graphs that do not indicate any clear pattern.
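A minimal sketch of how such a p.m.f. can be graphed, assuming matplotlib is available (the values are the skewed distribution from the table above):

```python
import matplotlib.pyplot as plt

x = [1, 2, 3, 4]
p = [0.4, 0.3, 0.2, 0.1]   # positively skewed p.m.f. from the table above

plt.bar(x, p)
plt.xlabel("x")
plt.ylabel("Probability")
plt.title("Positively skewed p.m.f.")
plt.show()
```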
Rolling a dice twice: X = sum of outcomes

x | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12
P(X = x) | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36

[Bar chart of the p.m.f.: symmetric, peaking at x = 7]
Rolling a dice twice: X = smaller of outcomes

x | 1 | 2 | 3 | 4 | 5 | 6
P(X = x) | 11/36 | 9/36 | 7/36 | 5/36 | 3/36 | 1/36

[Bar chart of the p.m.f.: probabilities decrease from x = 1 to x = 6]
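Both tables can be verified by enumerating all 36 equally likely outcomes of two dice rolls. A sketch:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all 36 ordered pairs

sum_counts = Counter(a + b for a, b in outcomes)
min_counts = Counter(min(a, b) for a, b in outcomes)

# p.m.f. of the sum: 1/36, 2/36, ..., 6/36, ..., 1/36
print({x: Fraction(c, 36) for x, c in sorted(sum_counts.items())})
# p.m.f. of the smaller outcome: 11/36, 9/36, ..., 1/36
print({x: Fraction(c, 36) for x, c in sorted(min_counts.items())})
```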
Cumulative Distribution Function (c.d.f.)
Here, the c.d.f. accumulates the probabilities P(X = xi) as a increases; its graph is similar to a step function.
> The cumulative distribution function (c.d.f.), F, can be expressed by
F(a) = P(X ≤ a)
> If X is a discrete random variable whose possible values are x1, x2, x3, ..., where x1 < x2 < x3 < ..., then the distribution function F of X is a step function.
Step function
> Let X be a discrete random variable with the following probability mass function:

x | 1 | 2 | 3 | 4
P(X = x) | 1/2 | 1/4 | 1/8 | 1/8

> The cumulative distribution function of X is given by

F(a) = 0 for a < 1
F(a) = 1/2 for 1 ≤ a < 2
F(a) = 3/4 for 2 ≤ a < 3
F(a) = 7/8 for 3 ≤ a < 4
F(a) = 1 for a ≥ 4
Until a = 1, F(a) is 0; then, for 1 ≤ a < 2, F(a) becomes 0.5.
Note: Here, X is a discrete random variable (not continuous), so F is a step function that jumps at each possible value of X.
When you need to solve for P(X = 4), subtract the value of F just below the jump at 4 from the value at 4: P(X = 4) = F(4) − F(4⁻) = 1 − 7/8 = 1/8.
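A sketch of computing F from a p.m.f. and recovering P(X = 4) from the jump of F at 4, using the step-function example above:

```python
from fractions import Fraction

pmf = {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 8), 4: Fraction(1, 8)}

def F(a):
    """Cumulative distribution function F(a) = P(X <= a)."""
    return sum(p for x, p in pmf.items() if x <= a)

print(F(0), F(1.5), F(2.5), F(3.5), F(10))   # 0, 1/2, 3/4, 7/8, 1

# P(X = 4) is the size of the jump of F at 4:
print(F(4) - F(3.999))                       # 1/8
```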
Week 10

Expectation of a Random Variable
The expected value of a random variable is the weighted average of its possible values, weighted by their probabilities; it is the long-run average outcome when the experiment is repeated many times.
Expectation of a random variable
Definition
Let X be a discrete random variable taking values x1, x2, .... The expected value of X, denoted by E(X) and referred to as the expectation of X, is given by
E(X) = ∑ xi P(X = xi)
Rolling a dice once
> Random experiment: Roll a dice once.
> Sample space: S = {1, 2, 3, 4, 5, 6}
> Random variable X is the outcome of the roll.
> The probability distribution is given by

x | 1 | 2 | 3 | 4 | 5 | 6
P(X = x) | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6

> E(X) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 3.5
> Does this mean that if we roll a dice once, we should expect the outcome to be 3.5?
> No! What the expected value tells us is what we would expect the average of a large number of rolls to be in the long run.
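A quick simulation illustrates the long-run interpretation (a sketch; the sample size is an arbitrary choice):

```python
import random

rolls = [random.randint(1, 6) for _ in range(100_000)]
# The sample average is close to E(X) = 3.5, even though no single roll is 3.5.
print(sum(rolls) / len(rolls))
```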
Bernoulli random variable
> A random variable that takes on either the value 1 or 0 is
called a Bernoulli random variable.
> Let X be a Bernoulli random variable that takes on the value
1 with probability p.
> The probability distribution of the random variable is
x | 0 | 1
P(X = x) | 1 − p | p

> Expected value of a Bernoulli random variable:
E(X) = 0 × (1 − p) + 1 × p = p
Discrete uniform random variable
> Let X be a random variable that is equally likely to take any of the values 1, 2, ..., n.
> Probability mass function:

x | 1 | 2 | ... | n
P(X = x) | 1/n | 1/n | ... | 1/n

> E(X) = (1/n)(1 + 2 + ... + n) = (n + 1)/2

Properties of Expectation of a Random Variable
Expectation of a function of a random variable
Proposition
Let X be a discrete random variable which takes values xi with probability mass function P(X = xi). Let g be any real-valued function. The expected value of g(X) is
E(g(X)) = ∑ g(xi) P(X = xi)
Corollary
If a and b are constants, E(aX + b) = aE(X) + b
Example
> Let X be a discrete random variable with the following distribution

x | −1 | 0 | 1
P(X = x) | 0.2 | 0.5 | 0.3

> Let Y = g(X) = X². What is E(Y)?
> E(Y) = (−1)² × 0.2 + 0² × 0.5 + 1² × 0.3 = 0.5
> Distribution of Y:

y | 0 | 1
P(Y = y) | 0.5 | 0.5
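A sketch that computes E(g(X)) directly from the proposition, and checks the corollary E(aX + b) = aE(X) + b on the same distribution (a and b are arbitrary choices):

```python
xs    = [-1, 0, 1]
probs = [0.2, 0.5, 0.3]

def expect(g):
    """E(g(X)) = sum of g(xi) * P(X = xi)."""
    return sum(g(x) * p for x, p in zip(xs, probs))

print(expect(lambda x: x**2))          # E(X^2) = 0.5
a, b = 3, 2
print(expect(lambda x: a * x + b))     # E(aX + b)
print(a * expect(lambda x: x) + b)     # a*E(X) + b (matches the line above)
```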
Variance of a random variable
> Let's denote the expected value of a random variable X by the Greek letter μ.
Definition
Let X be a random variable with expected value μ. Then the variance of X, denoted by Var(X) or V(X), is defined by
Var(X) = E[(X − μ)²]
> In other words, the variance of a random variable X measures the average squared deviation of the random variable from its mean μ.
Computational formula for Var(X)

Var(X) = E[(X − μ)²]
       = E[X² − 2μX + μ²]
       = E(X²) − 2μE(X) + μ²
       = E(X²) − 2μ² + μ²
       = E(X²) − μ²

That is, Var(X) = E(X²) − (E(X))².
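A sketch verifying the computational formula on the distribution from the previous example (x = −1, 0, 1 with probabilities 0.2, 0.5, 0.3):

```python
xs    = [-1, 0, 1]
probs = [0.2, 0.5, 0.3]

mu  = sum(x * p for x, p in zip(xs, probs))       # E(X)
ex2 = sum(x**2 * p for x, p in zip(xs, probs))    # E(X^2)

var_definition    = sum((x - mu)**2 * p for x, p in zip(xs, probs))
var_computational = ex2 - mu**2

print(var_definition, var_computational)          # both 0.49
```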
Bernoulli random variable
> A random variable that takes on either the value 1 or 0 is
called a Bernoulli random variable.
> Let X be a Bernoulli random variable that takes on the value
1 with probability p.
> The probability distribution of the random variable is

x | 0 | 1
P(X = x) | 1 − p | p

> Expected value of a Bernoulli random variable:
E(X) = 0 × (1 − p) + 1 × p = p
> Variance of a Bernoulli random variable: since E(X²) = 0² × (1 − p) + 1² × p = p,
Var(X) = E(X²) − (E(X))² = p − p² = p(1 − p)
Discrete uniform random variable
> Let X be a random variable that is equally likely to take any of the values 1, 2, ..., n.
> Probability mass function:

x | 1 | 2 | ... | n
P(X = x) | 1/n | 1/n | ... | 1/n

> E(X) = (1/n)(1 + 2 + ... + n) = (n + 1)/2
> E(X²) = (1/n)(1² + 2² + ... + n²) = (n + 1)(2n + 1)/6
> Var(X) = E(X²) − (E(X))² = (n + 1)(2n + 1)/6 − ((n + 1)/2)² = (n² − 1)/12
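A sketch checking these closed forms by direct enumeration (n = 6, matching a fair die):

```python
n = 6
values = range(1, n + 1)

e_x  = sum(values) / n
e_x2 = sum(v**2 for v in values) / n
var  = e_x2 - e_x**2

print(e_x,  (n + 1) / 2)                  # 3.5 and 3.5
print(e_x2, (n + 1) * (2 * n + 1) / 6)    # 15.166... and 15.166...
print(var,  (n**2 - 1) / 12)              # 2.916... and 2.916...
```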
Properties of Variance
Variance of a function of a random variable
Proposition
Let X be a random variable and let c be a constant. Then
> Var(cX) = c² Var(X)
> Var(X + c) = Var(X)
Corollary
If a and b are constants, Var(aX + b) = a² Var(X)
But if the random variables are independent, then:
Variance of sum of independent random variables
Result
Let X and Y be independent random variables. Then
Var(X + Y) = Var(X) + Var(Y)
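A simulation sketch of this result for two independent dice (sample variances, so the match is approximate):

```python
import random
import statistics

N = 100_000
x = [random.randint(1, 6) for _ in range(N)]
y = [random.randint(1, 6) for _ in range(N)]   # independent of x

var_sum = statistics.pvariance([a + b for a, b in zip(x, y)])
print(var_sum)                                             # ~ 5.83
print(statistics.pvariance(x) + statistics.pvariance(y))   # ~ 35/12 + 35/12 = 5.83
```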
Variance of sum of many independent random variables
> The result that the variance of the sum of independent
random variables is equal to the sum of the variances holds for
not only two but any number of random variables.
> Let X1, X2, ..., Xk be k independent discrete random variables. Then
Var(X1 + X2 + ... + Xk) = Var(X1) + Var(X2) + ... + Var(Xk)
Standard Deviation (SD) of a random variable
The SD of a random variable is the square root of its variance: SD(X) = √Var(X).
Note: The expectation does not reflect risk, but the standard deviation does, and this is why the SD is important.
Properties of standard deviation
Proposition
Let X be a random variable, let c be a constant, then
> SD(cX) = |c| SD(X)
> SD(X +¢) = SD(X)
Important problems
1. If Var(X) = 4, what is SD(3X)? Answer: SD(3X) = 3 × √4 = 6.
2. If Var(2X + 3) = 16, what is SD(X)? Answer: Var(X) = 16/4 = 4, so SD(X) = 2.

Week 11
Binomial distribution - Bernoulli distribution
A Bernoulli trial has a sample space of two outcomes, namely, success and failure.
Examples of Bernoulli trials
> Experiment: Tossing a coin: S = {Head, Tail}
> Success: Head
> Failure: Tail
> Experiment: Rolling a dice: S = {1,2,3,4,5,6}
> Success: Getting a six.
> Failure: Getting any other number.
> Experiment: Opinion polls: S = {Yes, No}
> Success: Yes
> Failure: No
As shown previously in week 10:
Bernoulli random variable
> A random variable that takes on either the value 1 or 0 is
called a Bernoulli random variable.
» Let X be a Bernoulli random variable that takes on the value
1 with probability p.
> The probability distribution of the random variable is

x | 0 | 1
P(X = x) | 1 − p | p

> Expected value of a Bernoulli random variable:
E(X) = 0 × (1 − p) + 1 × p = p
> Variance of a Bernoulli random variable:
Var(X) = p − p² = p(1 − p)
Note: The variance of a Bernoulli random variable is the quadratic p − p². It is a downward-facing parabola with its maximum at p = −b/(2a) = 1/2, so the largest variance in a Bernoulli trial is 1/4.
> The largest variance occurs when p = 1/2, when success and
failure are equally likely.
> In other words, the most uncertain Bernoulli trials, those with
the largest variance, resemble tosses of a fair coin.
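A sketch evaluating p(1 − p) over a grid of p values to confirm the maximum of 1/4 at p = 1/2:

```python
ps = [i / 100 for i in range(101)]
variances = [p * (1 - p) for p in ps]

# Find the p with the largest Bernoulli variance.
best = max(range(len(ps)), key=lambda i: variances[i])
print(ps[best], variances[best])   # 0.5 0.25
```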
Independent and Identically Distributed (IID) Bernoulli trials
> A collection of Bernoulli trials defines iid Bernoulli random
variables, one for each trial
> The abbreviation iid stands for independent and identically
distributed.
Definition
A collection of random variables is iid if the random variables are
independent and share a common probability distribution
The number of successes in n iid Bernoulli trials follows a Binomial distribution.
For instance, when a coin is tossed three times:
n = 3 independent trials
> Let n = 3 independent Bernoulli trials.
> Let p be the probability of success.
> The probabilities of the outcomes of the independent trials are

S.No | Outcome | Probability
1 | (s,s,s) | p × p × p
2 | (s,s,f) | p × p × (1 − p)
3 | (s,f,s) | p × (1 − p) × p
4 | (s,f,f) | p × (1 − p) × (1 − p)
5 | (f,s,s) | (1 − p) × p × p
6 | (f,s,f) | (1 − p) × p × (1 − p)
7 | (f,f,s) | (1 − p) × (1 − p) × p
8 | (f,f,f) | (1 − p) × (1 − p) × (1 − p)

n = 3 independent trials, X = number of successes
> Let n = 3 independent Bernoulli trials.
> Let p be the probability of success.
> Let X = number of successes in 3 independent trials.
> The probabilities of the outcomes of the independent trials are

S.No | Outcome | Number of successes | Probability
1 | (s,s,s) | 3 | p × p × p
2 | (s,s,f) | 2 | p × p × (1 − p)
3 | (s,f,s) | 2 | p × (1 − p) × p
4 | (s,f,f) | 1 | p × (1 − p) × (1 − p)
5 | (f,s,s) | 2 | (1 − p) × p × p
6 | (f,s,f) | 1 | (1 − p) × p × (1 − p)
7 | (f,f,s) | 1 | (1 − p) × (1 − p) × p
8 | (f,f,f) | 0 | (1 − p) × (1 − p) × (1 − p)

> The probability distribution of X is

x | 0 | 1 | 2 | 3
P(X = x) | (1 − p)³ | 3p(1 − p)² | 3p²(1 − p) | p³
For n independent trials:

n independent trials, X = number of successes
> Consider any outcome that results in a total of i successes.
> This outcome will have a total of i successes and (n − i) failures.
> Probability of i successes and (n − i) failures = p^i × (1 − p)^(n−i)
> The number of different outcomes that result in i successes and (n − i) failures = C(n, i) = n!/(i!(n − i)!)
> The probability of i successes in n trials is given by
P(X = i) = C(n, i) × p^i × (1 − p)^(n−i)
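A sketch that checks this formula against brute-force enumeration of all outcome sequences (n = 3 and p = 0.4 are arbitrary choices):

```python
import math
from itertools import product

n, p = 3, 0.4

def binomial_pmf(i):
    """P(X = i) = C(n, i) * p^i * (1 - p)^(n - i)."""
    return math.comb(n, i) * p**i * (1 - p)**(n - i)

def brute_force(i):
    """Sum the probability of every sequence with exactly i successes."""
    total = 0.0
    for seq in product([1, 0], repeat=n):   # 1 = success, 0 = failure
        if sum(seq) == i:
            prob = 1.0
            for s in seq:
                prob *= p if s == 1 else (1 - p)
            total += prob
    return total

for i in range(n + 1):
    print(i, binomial_pmf(i), brute_force(i))   # the two columns agree
```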