Unit I (Part 2)

This unit defines discrete and continuous random variables. A random variable is a function that maps the outcomes of a random experiment to real numbers. Discrete random variables take countably many values, while continuous random variables can take any value in an interval. The unit gives examples and definitions of probability mass functions for discrete variables and probability density functions for continuous variables, discusses cumulative distribution functions and their properties, and defines mathematical expectation for both discrete and continuous random variables.

Definition

A random variable 𝑋 is a function from the sample space 𝑆 to the real numbers, i.e., 𝑋 is a rule which assigns a number 𝑋(𝑠) to each outcome 𝑠 ∈ 𝑆.

(or)

A random variable is a function that maps the sample space to a set of real values,
𝑋: 𝑆 → 𝑅
where 𝑋 is the random variable, 𝑆 is the sample space and 𝑅 is the set of real numbers.

(or)

Simply, any real-valued function defined on the sample space of a random experiment is called a random variable (or stochastic variable).

Note: Why is 𝑋 called a random variable? It is "random" because its value depends on the random outcome of an experiment.
Example-1: Consider the sample space of a throw of two dice,

𝑆 = {(1, 1), (1, 2), . . . , (6, 6)}.

The random variable 𝑋 corresponding to the sum is

𝑋(1, 1) = 2, 𝑋(1, 2) = 3, and in general 𝑋(𝑖, 𝑗) = 𝑖 + 𝑗.

Here 𝑋: 𝑆 → 𝑅 is a function from the sample space to the set of real numbers,

∴ 𝑋 is a random variable.
Thus, intuitively, by a random variable (RV) we mean a real number 𝑋 connected with the outcome of a random experiment with sample space 𝑆.

Example-2
If 𝑆 consists of two tosses of a coin, we may consider the random variable 𝑋 = number of heads (0, 1 or 2).

Outcome      HH   HT   TH   TT
Value of X    2    1    1    0

Thus, to each outcome there corresponds a real number.

A random variable is either a discrete random variable or a continuous random variable.
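The idea that a random variable is just a function on the sample space can be made concrete in code. The following is a minimal sketch (in Python, not part of the original notes) for Example-2; the names S and X are chosen here purely for illustration.

```python
from itertools import product

# Sample space of two coin tosses: S = {HH, HT, TH, TT}
S = [''.join(outcome) for outcome in product('HT', repeat=2)]

# The random variable X maps each outcome s to a real number,
# here the number of heads in s.
def X(s):
    return s.count('H')

for s in S:
    print(s, '->', X(s))   # HH -> 2, HT -> 1, TH -> 1, TT -> 0
```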
There are two types of random variables

1. Discrete Random variable


2. Continuous Random variable

Def:

If 𝑋 is a random variable which can take a finite number or a countably infinite number of values, then 𝑋 is called a discrete random variable.

Examples of discrete random variables:

1. Number of children in a family.

2. Number of requests sent to a web server.

3. Number shown when a die is thrown.

4. Number of transmitted bits received in error.

5. The number of patients in a doctor's surgery.

6. The number of defective light bulbs in a box of ten.

7. The number of failures of an electronic device in its first five years of operation.
Also, if the possible values are any of these:

• {1, 2, 3, …}
• {…, −2, −1, 0, 1, 2, 3, …}
• {0, 2, 4, 6, …}
• {0, 0.5, 1.0, 1.5, 2.0, …}
• any finite set

then the random variable is discrete.

Def:

If 𝑋 is a random variable which can take all values in an interval, then 𝑋 is called a continuous random variable.

(or)

A random variable 𝑋 which takes all possible values in a given interval is called a continuous random variable.

Examples:

1. Suppose the temperature in a city lies between 30° and 45° centigrade. The temperature can take any value in the interval 30° to 45°.

2. The length of time I have to wait at the bus stop for a number 2 bus.

3. Blood pressure.

4. Weight.

5. Speed of a car.

6. The time taken by a program for execution.

7. The distance a vehicle travels before stopping, when brakes are applied to the moving vehicle.

Also, if the possible values are any of these:

• all numbers between 0 and ∞, i.e. (0, ∞)
• all numbers between −∞ and ∞, i.e. (−∞, ∞)
• all numbers between 0 and 1

then the random variable is continuous.

Note: A continuous random variable is not defined at specific values. Instead, it is defined over an interval of values, and probabilities are represented by areas under a curve. The probability of observing any single value is equal to 0, since the number of values which may be assumed by the random variable is infinite.

Note-2: Probability distributions

The "randomness" of a random variable is described by a probability distribution. Informally, the probability distribution specifies the probability or likelihood for a random variable to assume a particular value.
Probability Mass Function:

If 𝑋 is a discrete RV which can take the values 𝑥₁, 𝑥₂, …, 𝑥ₙ such that 𝑃(𝑋 = 𝑥ᵢ) = 𝑝ᵢ, then the function 𝑝 is called the probability function or probability mass function, provided it satisfies the following conditions:

(i) 𝑝ᵢ > 0
(ii) ∑_{i} 𝑝ᵢ = ∑_{i} 𝑃(𝑋 = 𝑥ᵢ) = 1

The collection of pairs {(𝑥₁, 𝑝₁), (𝑥₂, 𝑝₂), ……} is called the probability distribution of the random variable 𝑋.

Sometimes the probability distribution of the random variable 𝑋 is displayed as a table:

𝑋 = 𝑥ᵢ        𝑥₁    𝑥₂    ………    𝑥ᵢ    ………    𝑥ₙ
𝑃(𝑋 = 𝑥ᵢ)     𝑝₁    𝑝₂    ………    𝑝ᵢ    ………    𝑝ₙ
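As a quick illustration, the two conditions on a pmf can be checked numerically. The sketch below (Python, using exact fractions; the distribution shown is a hypothetical example, not one from these notes) verifies that every pᵢ > 0 and that the probabilities sum to 1.

```python
from fractions import Fraction

# A hypothetical probability distribution {(x_i, p_i)}
distribution = {
    0: Fraction(1, 8),
    1: Fraction(3, 8),
    2: Fraction(3, 8),
    3: Fraction(1, 8),
}

# Condition (i): every p_i must be positive
assert all(p > 0 for p in distribution.values())

# Condition (ii): the probabilities must sum to 1
assert sum(distribution.values()) == 1

print("Valid probability mass function:", distribution)
```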

Probability Density Function

If 𝑋 is a continuous random variable such that

𝑃(𝑥 − (1/2)𝑑𝑥 < 𝑋 < 𝑥 + (1/2)𝑑𝑥) = 𝑓(𝑥) 𝑑𝑥,

then the function 𝑓(𝑥) is called the probability density function (pdf) of the random variable 𝑋, provided it satisfies the following conditions:

(i) 𝑓(𝑥) ≥ 0 over the range of values of 𝑋, say 𝑎 < 𝑥 < 𝑏

(ii) ∫_{a}^{b} 𝑓(𝑥) 𝑑𝑥 = 1

The curve 𝑦 = 𝑓(𝑥) is called the probability curve of the random variable 𝑋.

Note: 1. When 𝑋 is a continuous random variable, then

𝑃(𝑋 = 𝑎) = ∫_{a}^{a} 𝑓(𝑥) 𝑑𝑥 = 0

2. When 𝑋 is a continuous random variable, then

𝑃(𝑎 ≤ 𝑋 ≤ 𝑏) = 𝑃(𝑎 < 𝑋 ≤ 𝑏) = 𝑃(𝑎 ≤ 𝑋 < 𝑏) = 𝑃(𝑎 < 𝑋 < 𝑏) = ∫_{a}^{b} 𝑓(𝑥) 𝑑𝑥

Probability as an Area

𝑃(𝑋 > 𝑎) = ∫_{a}^{∞} 𝑓(𝑥) 𝑑𝑥 = area under the probability curve to the right of 𝑎.
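The area interpretation can be checked numerically. The sketch below (Python; the exponential density f(x) = e^(−x) on (0, ∞) is an assumed example, not one from these notes) approximates the total area and P(X > a) with a simple Riemann sum.

```python
import math

def f(x):
    """Assumed example density: f(x) = e^(-x) for x >= 0."""
    return math.exp(-x)

def integrate(g, lo, hi, n=100_000):
    """Midpoint Riemann-sum approximation of the integral of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

# Total area under the curve should be 1 (upper limit truncated at 50).
print(integrate(f, 0, 50))        # ~1.0

# P(X > a) = area to the right of a; the exact value is e^(-a).
a = 1.0
print(integrate(f, a, 50))        # ~0.3679
print(math.exp(-a))               # 0.3678794...
```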
Cumulative Distribution Function (CDF)

If 𝑋 is an RV, discrete or continuous, then 𝑃(𝑋 ≤ 𝑥) is called the cumulative distribution function of 𝑋, or simply the distribution function of 𝑋, and it is denoted by 𝐹(𝑥).

If 𝑋 is discrete, then the cumulative distribution function of 𝑋 is given by

𝐹(𝑥) = ∑_{xᵢ ≤ x} 𝑝(𝑥ᵢ)

If 𝑋 is continuous, then the cumulative distribution function of 𝑋 is given by

𝐹(𝑥) = 𝑃(−∞ < 𝑋 ≤ 𝑥) = ∫_{−∞}^{x} 𝑓(𝑥) 𝑑𝑥

The cumulative distribution function 𝐹(𝑥) of a continuous random variable 𝑋 expresses the probability that 𝑋 does not exceed the value 𝑥.
Properties of the cdf 𝐹(𝑥)

1. 𝐹(𝑥) is a non-decreasing function of 𝑥, i.e.

𝑥₁ < 𝑥₂ ⇒ 𝐹(𝑥₁) ≤ 𝐹(𝑥₂)

2. 𝐹(−∞) = 0, 𝐹(∞) = 1, since

𝐹(∞) = 𝑃(−∞ < 𝑋 ≤ ∞) = ∫_{−∞}^{∞} 𝑓(𝑥) 𝑑𝑥 = 1

3. If 𝑋 is a discrete random variable taking values 𝑥₁, 𝑥₂, …, 𝑥ₙ, where 𝑥₁ < 𝑥₂ < …… < 𝑥ᵢ₋₁ < 𝑥ᵢ < ……, then

𝑃(𝑋 = 𝑥ᵢ) = 𝐹(𝑥ᵢ) − 𝐹(𝑥ᵢ₋₁)

In the case of a continuous RV,

𝑃(𝑎 ≤ 𝑋 ≤ 𝑏) = 𝐹(𝑏) − 𝐹(𝑎)

4. In the case of a continuous RV,

(𝑑/𝑑𝑥) 𝐹(𝑥) = 𝑓(𝑥)
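For a discrete RV the cdf is just a running total of the pmf. A minimal sketch (Python; the pmf used is a hypothetical example, not one from these notes):

```python
from fractions import Fraction

# Hypothetical pmf of a discrete RV X (for illustration only)
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def F(x):
    """CDF of a discrete RV: F(x) = sum of p(x_i) over all x_i <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

# Property 3: P(X = x_i) = F(x_i) - F(x_{i-1})
xs = sorted(pmf)
for prev, cur in zip(xs, xs[1:]):
    assert F(cur) - F(prev) == pmf[cur]

print([(x, F(x)) for x in xs])   # F is non-decreasing and reaches 1
```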
Note:

For a discrete random variable, we look up the value of the PMF at a single point to find the probability 𝑃(𝑋 = 𝑥).

For a continuous random variable, we integrate the PDF over an interval to find the probability that 𝑋 falls in that interval.
Mathematical Expectation

Definition: Let 𝑋 be a continuous random variable with p.d.f. 𝑓(𝑥). Then the mathematical expectation of 𝑋, denoted by 𝐸(𝑋), is given by

𝐸(𝑋) = ∫_{−∞}^{∞} 𝑥 𝑓(𝑥) 𝑑𝑥

Definition: Let 𝑋 be a discrete random variable with probability mass function 𝑓(𝑥) or 𝑃(𝑋 = 𝑥). Then the mathematical expectation of 𝑋, denoted by 𝐸(𝑋), is defined as

𝐸(𝑋) = ∑_{x} 𝑥 𝑓(𝑥) = ∑_{x} 𝑥 𝑃(𝑋 = 𝑥)
Let 𝑔(𝑋) be any function of 𝑋. Then

i) 𝐸[𝑔(𝑋)] = ∫_{−∞}^{∞} 𝑔(𝑥) 𝑓(𝑥) 𝑑𝑥, when 𝑋 is a continuous RV

ii) 𝐸[𝑔(𝑋)] = ∑_{x} 𝑔(𝑥) 𝑓(𝑥) = ∑_{x} 𝑔(𝑥) 𝑃(𝑋 = 𝑥), when 𝑋 is discrete

In the case of a discrete random variable 𝑋,

1. 𝐸(𝑋) = ∑_{x} 𝑥 𝑃(𝑋 = 𝑥)

2. 𝐸(𝑋²) = ∑_{x} 𝑥² 𝑃(𝑋 = 𝑥)

3. 𝐸(𝑋³) = ∑_{x} 𝑥³ 𝑃(𝑋 = 𝑥)

etc.
In the case of a continuous random variable 𝑋,

1. 𝐸(𝑋) = ∫_{−∞}^{∞} 𝑥 𝑓(𝑥) 𝑑𝑥

2. 𝐸(𝑋²) = ∫_{−∞}^{∞} 𝑥² 𝑓(𝑥) 𝑑𝑥

3. 𝐸(𝑋³) = ∫_{−∞}^{∞} 𝑥³ 𝑓(𝑥) 𝑑𝑥

etc.
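These definitions translate directly into code. A minimal sketch follows (Python; both the discrete pmf and the continuous density below are assumed examples chosen for illustration, and the integral is approximated numerically):

```python
from fractions import Fraction

# Discrete case: E(g(X)) = sum over x of g(x) * P(X = x)
pmf = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}   # assumed pmf

def E_discrete(g):
    return sum(g(x) * p for x, p in pmf.items())

print(E_discrete(lambda x: x))      # E(X)   = 5/3
print(E_discrete(lambda x: x**2))   # E(X^2) = 10/3

# Continuous case: E(g(X)) = integral of g(x) * f(x) dx
def f(x):                            # assumed density: f(x) = 2x on (0, 1)
    return 2 * x if 0 < x < 1 else 0.0

def E_continuous(g, lo=0.0, hi=1.0, n=100_000):
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h) for i in range(n)) * h

print(E_continuous(lambda x: x))     # E(X)   ~ 2/3
print(E_continuous(lambda x: x**2))  # E(X^2) ~ 1/2
```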

Expectation Properties:

Important results:

1. 𝐸(𝑋) = 𝜇 = mean = 𝑥̄   (∵ Mean = ∑ 𝑝ᵢ𝑥ᵢ / ∑ 𝑝ᵢ = ∑ 𝑝ᵢ𝑥ᵢ, as ∑ 𝑝ᵢ = 1)

2. 𝐸(𝑋 + 𝑘) = 𝐸(𝑋) + 𝑘, where 𝑘 is a constant.

3. 𝐸(𝑘𝑋) = 𝑘𝐸(𝑋)

4. 𝐸(𝑎𝑋 + 𝑏) = 𝑎𝐸(𝑋) + 𝑏, where 𝑎, 𝑏 are constants.
5. 𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌), provided 𝐸(𝑋) and 𝐸(𝑌) exist.

6. 𝐸(𝑋 − 𝜇) = 𝐸(𝑋 − 𝑥̄) = 0. Why?

7. 𝐸(𝑋𝑌) = 𝐸(𝑋) · 𝐸(𝑌) when 𝑋, 𝑌 are independent variables.

8. 𝐸(1/𝑋) ≠ 1/𝐸(𝑋)

Formulas for Mean and Variance of the Distribution

1. Mean: The mean 𝜇 (or 𝑥̄) of the distribution is given by

Mean = 𝜇 = ∑ 𝑝ᵢ𝑥ᵢ / ∑ 𝑝ᵢ = ∑ 𝑝ᵢ𝑥ᵢ   (∵ ∑ 𝑝ᵢ = 1)

∴ Mean = 𝜇 = 𝐸(𝑋) = ∑ 𝑥 𝑃(𝑋 = 𝑥) = ∑ 𝑝ᵢ𝑥ᵢ

If 𝑋 is a discrete random variable, then

Mean = 𝜇 = 𝐸(𝑋) = ∑ 𝑥 𝑃(𝑋 = 𝑥) = ∑ 𝑝ᵢ𝑥ᵢ

If 𝑋 is a continuous random variable, then

Mean = 𝜇 = 𝐸(𝑋) = ∫_{−∞}^{∞} 𝑥 𝑓(𝑥) 𝑑𝑥
2. Variance:

The variance of the probability distribution of a random variable 𝑋, denoted by the symbol 𝜎², is defined as

Var(𝑋) = 𝑉(𝑋) = 𝐸[𝑋 − 𝐸(𝑋)]², where Mean = 𝜇 = 𝐸(𝑋).

We have

𝑉(𝑋) = 𝜎² = ∑_{i} (𝑥ᵢ − 𝜇)² 𝑝ᵢ
          = ∑_{i} (𝑥ᵢ² 𝑝ᵢ + 𝜇² 𝑝ᵢ − 2𝑥ᵢ𝜇𝑝ᵢ)
          = ∑_{i} 𝑥ᵢ² 𝑝ᵢ + 𝜇² − 2𝜇 ∑_{i} 𝑥ᵢ𝑝ᵢ
          = ∑_{i} 𝑥ᵢ² 𝑝ᵢ + 𝜇² − 2𝜇²      (∵ 𝜇 = ∑_{i} 𝑥ᵢ𝑝ᵢ)
          = ∑_{i} 𝑥ᵢ² 𝑝ᵢ − 𝜇²
          = 𝐸(𝑋²) − [𝐸(𝑋)]²

Variance = 𝜎² = 𝐸(𝑋²) − [𝐸(𝑋)]²

Standard deviation = S.D. = 𝜎 = √Variance

Results:

1. 𝑉(𝑘) = 0, where 𝑘 is a constant.

𝑉(𝑘) = 𝑉(𝑘𝑋⁰) = 𝐸[(𝑘𝑋⁰)²] − [𝐸(𝑘𝑋⁰)]² = 𝐸(𝑘²) − [𝐸(𝑘)]² = 𝑘² − 𝑘² = 0

i.e. the variance of any constant is 0.

2. 𝑉(𝑘𝑋) = 𝑘²𝑉(𝑋)

𝑉(𝑘𝑋) = 𝐸[(𝑘𝑋)²] − [𝐸(𝑘𝑋)]²
       = 𝑘²𝐸(𝑋²) − 𝑘²[𝐸(𝑋)]²
       = 𝑘²{𝐸(𝑋²) − [𝐸(𝑋)]²} = 𝑘²𝑉(𝑋)

3. 𝑉(𝑋 + 𝑘) = 𝑉(𝑋), where 𝑘 is a constant.

4. 𝑉(𝑎𝑋 + 𝑏) = 𝑎²𝑉(𝑋), where 𝑎, 𝑏 are constants.

Proof: Let 𝑌 = 𝑎𝑋 + 𝑏.

𝑉(𝑌) = 𝐸(𝑌²) − [𝐸(𝑌)]²
     = 𝐸((𝑎𝑋 + 𝑏)²) − [𝐸(𝑎𝑋 + 𝑏)]²
     = 𝐸(𝑎²𝑋² + 2𝑎𝑏𝑋 + 𝑏²) − {𝑎𝐸(𝑋) + 𝑏}²
     = 𝑎²𝐸(𝑋²) + 2𝑎𝑏𝐸(𝑋) + 𝑏² − 𝑎²[𝐸(𝑋)]² − 2𝑎𝑏𝐸(𝑋) − 𝑏²
     = 𝑎²𝐸(𝑋²) − 𝑎²[𝐸(𝑋)]²
     = 𝑎²{𝐸(𝑋²) − [𝐸(𝑋)]²} = 𝑎²𝑉(𝑋)

∴ 𝑉(𝑎𝑋 + 𝑏) = 𝑎²𝑉(𝑋)

• The expected value 𝜇 = 𝐸(𝑋) is a measure of location or central tendency.
• The standard deviation 𝜎 is a measure of the spread or scale.
• The variance 𝜎² = 𝑉(𝑋) is the square of the standard deviation.
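The rule 𝑉(𝑎𝑋 + 𝑏) = 𝑎²𝑉(𝑋) is easy to check numerically. A small sketch (Python; the pmf is again an assumed example):

```python
from fractions import Fraction

pmf = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}   # assumed pmf

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

def V(g):
    return E(lambda x: g(x) ** 2) - E(g) ** 2

a, b = 3, 7
lhs = V(lambda x: a * x + b)   # V(aX + b)
rhs = a ** 2 * V(lambda x: x)  # a^2 * V(X)
print(lhs, rhs, lhs == rhs)    # both equal 9/2, so True
```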

Expectation: The mean, expected value, or expectation of a random variable 𝑋 is written as 𝐸(𝑋) or 𝜇. The expectation is defined differently for continuous and discrete random variables.

Note: The 𝑟-th moment about the origin, denoted by 𝜇ᵣ′, is defined as

𝜇ᵣ′ = 𝐸(𝑋ʳ)

𝜇₁′ = 𝐸(𝑋), 𝜇₂′ = 𝐸(𝑋²), 𝜇₃′ = 𝐸(𝑋³), etc.
Moment Generating Function (MGF)

Def: The moment generating function (MGF) of a random variable 𝑋 (about the origin), whose probability function is 𝑓(𝑥), is defined as

𝑀_X(𝑡) = 𝐸[𝑒^{𝑡𝑋}]

If 𝑋 is a discrete random variable with pmf 𝑝(𝑥), then

𝑀_X(𝑡) = 𝐸[𝑒^{𝑡𝑋}] = ∑_{x} 𝑒^{𝑡𝑥} 𝑝(𝑥)

If 𝑋 is a continuous random variable with pdf 𝑓(𝑥), then

𝑀_X(𝑡) = 𝐸[𝑒^{𝑡𝑋}] = ∫_{−∞}^{∞} 𝑒^{𝑡𝑥} 𝑓(𝑥) 𝑑𝑥

where 𝑡 is a real parameter.
𝐸(𝑋ⁿ) = coefficient of 𝑡ⁿ/𝑛! in 𝑀_X(𝑡)

We have

𝑀_X(𝑡) = 𝐸[𝑒^{𝑡𝑋}]
        = 𝐸[1 + 𝑡𝑋/1! + (𝑡𝑋)²/2! + (𝑡𝑋)³/3! + ··· + (𝑡𝑋)ⁿ/𝑛! + ···]
        = 1 + (𝑡/1!)𝐸(𝑋) + (𝑡²/2!)𝐸(𝑋²) + ··· + (𝑡ⁿ/𝑛!)𝐸(𝑋ⁿ) + ···
        = 1 + (𝑡/1!)𝜇₁′ + (𝑡²/2!)𝜇₂′ + ··· + (𝑡ⁿ/𝑛!)𝜇ₙ′ + ···

∴ 𝐸(𝑋ⁿ) = coefficient of 𝑡ⁿ/𝑛! in 𝑀_X(𝑡)

Also,

(𝑑/𝑑𝑡) 𝑀_X(𝑡) = (𝑑/𝑑𝑡) {1 + (𝑡/1!)𝐸(𝑋) + (𝑡²/2!)𝐸(𝑋²) + ··· + (𝑡ⁿ/𝑛!)𝐸(𝑋ⁿ) + ···}
             = 𝐸(𝑋) + 𝑡𝐸(𝑋²) + ··· + (𝑛𝑡ⁿ⁻¹/𝑛!)𝐸(𝑋ⁿ) + ···

Putting 𝑡 = 0,

{(𝑑/𝑑𝑡) 𝑀_X(𝑡)}_{𝑡=0} = 𝐸(𝑋)
Similarly,

{(𝑑²/𝑑𝑡²) 𝑀_X(𝑡)}_{𝑡=0} = 𝐸(𝑋²)

{(𝑑³/𝑑𝑡³) 𝑀_X(𝑡)}_{𝑡=0} = 𝐸(𝑋³)

etc.

In summary:

𝐸(𝑋)  = {(𝑑/𝑑𝑡) 𝑀_X(𝑡)}_{𝑡=0}

𝐸(𝑋²) = {(𝑑²/𝑑𝑡²) 𝑀_X(𝑡)}_{𝑡=0}

𝐸(𝑋³) = {(𝑑³/𝑑𝑡³) 𝑀_X(𝑡)}_{𝑡=0}
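The derivative property of the MGF can be illustrated symbolically. A minimal sketch (Python with sympy, which is assumed to be available; the fair-die distribution matches Example-1 below):

```python
import sympy as sp

t = sp.symbols('t')

# MGF of a fair die: M_X(t) = E[e^(tX)] = (1/6) * sum of e^(tk), k = 1..6
M = sum(sp.Rational(1, 6) * sp.exp(t * k) for k in range(1, 7))

# E(X^n) = n-th derivative of M_X(t) evaluated at t = 0
EX  = sp.diff(M, t, 1).subs(t, 0)   # 7/2
EX2 = sp.diff(M, t, 2).subs(t, 0)   # 91/6
print(EX, EX2, EX2 - EX**2)         # variance 35/12
```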
Example-1

When a die is thrown, 𝑋 denotes the number that turns up. Find 𝐸(𝑋), 𝐸(𝑋²) and 𝑉(𝑋).

Solution: Let 𝑋 be a R.V. denoting the number that turns up when a die is thrown. Here 𝑋 takes the values 1, 2, 3, 4, 5 and 6, each with probability 1/6.

𝑋 = 𝑥        1     2     3     4     5     6
𝑃(𝑋 = 𝑥)    1/6   1/6   1/6   1/6   1/6   1/6

(i) 𝐸(𝑋) = ∑_{i=1}^{6} 𝑥ᵢ 𝑃(𝑥ᵢ)
        = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 21/6 = 7/2

∴ 𝐸(𝑋) = 7/2

(ii) 𝐸(𝑋²) = ∑_{i=1}^{6} 𝑥ᵢ² 𝑃(𝑥ᵢ)
         = 1²(1/6) + 2²(1/6) + 3²(1/6) + 4²(1/6) + 5²(1/6) + 6²(1/6)
         = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6

(iii) 𝑉(𝑋) = 𝐸(𝑋²) − [𝐸(𝑋)]² = 91/6 − (7/2)² = 35/12
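The same numbers can be verified with a short exact computation (Python, using fractions; a sketch, not part of the original notes):

```python
from fractions import Fraction

p = Fraction(1, 6)                           # each face of a fair die
EX  = sum(x * p for x in range(1, 7))        # 7/2
EX2 = sum(x * x * p for x in range(1, 7))    # 91/6
VX  = EX2 - EX ** 2                          # 35/12
print(EX, EX2, VX)
```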
Example-2

Two unbiased dice are thrown. Find the expected value of the sum of the numbers of points on them.

Solution: Let 𝑋 be the random variable representing the sum of the numbers obtained on the two dice, i.e. 𝑋(𝑎, 𝑏) = 𝑎 + 𝑏. Then 𝑋 can take one of the values 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12.

The probability function of the RV 𝑋 is

𝑋 = 𝑥       2     3     4     5     6     7     8     9    10    11    12
𝑃(𝑋 = 𝑥)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

𝐸(𝑋) = ∑_{x} 𝑥 · 𝑃(𝑥)
     = 2(1/36) + 3(2/36) + 4(3/36) + 5(4/36) + 6(5/36) + 7(6/36)
       + 8(5/36) + 9(4/36) + 10(3/36) + 11(2/36) + 12(1/36)
     = 252/36 = 7
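This expectation can also be obtained by brute-force enumeration of the 36 equally likely outcomes; a quick sketch in Python:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two unbiased dice
outcomes = list(product(range(1, 7), repeat=2))

# E(X) where X(a, b) = a + b
EX = sum(Fraction(a + b, 1) for a, b in outcomes) / len(outcomes)
print(EX)   # 7
```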

Example-3

A random variable 𝑋 has the following probability distribution:

𝑋 = 𝑥        0    1    2    3    4    5    6    7    8
𝑃(𝑋 = 𝑥)     a   3a   5a   7a   9a  11a  13a  15a  17a

Find (i) the value of a, (ii) 𝑃(𝑋 < 3) and 𝑃(𝑋 ≥ 3), (iii) 𝑃(0 < 𝑋 < 5), and (iv) the distribution function of 𝑋.

Solution:
(i) Since 𝑃(𝑥) is a probability mass function, we have ∑ 𝑝(𝑥ᵢ) = 1

a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1

81a = 1

a = 1/81

(ii) 𝑃(𝑋 < 3) = 𝑃(𝑋 = 0) + 𝑃(𝑋 = 1) + 𝑃(𝑋 = 2) = a + 3a + 5a = 9a = 9/81 = 1/9

𝑃(𝑋 ≥ 3) = 1 − 𝑃(𝑋 < 3) = 1 − 1/9 = 8/9

(iii) 𝑃(0 < 𝑋 < 5) = 𝑃(𝑋 = 1) + 𝑃(𝑋 = 2) + 𝑃(𝑋 = 3) + 𝑃(𝑋 = 4) = 3a + 5a + 7a + 9a = 24a = 24/81 = 8/27
(iv) Distribution function of 𝑋:

𝑋 = 𝑥     𝐹(𝑥) = 𝑃(𝑋 ≤ 𝑥)

0         𝐹(0) = 𝑃(𝑋 ≤ 0) = 𝑃(𝑋 = 0) = 1/81
1         𝐹(1) = 𝑃(𝑋 ≤ 1) = a + 3a = 4/81
2         𝐹(2) = 𝑃(𝑋 ≤ 2) = a + 3a + 5a = 9/81
3         𝐹(3) = 𝑃(𝑋 ≤ 3) = a + 3a + 5a + 7a = 16/81
4         𝐹(4) = 𝑃(𝑋 ≤ 4) = a + 3a + 5a + 7a + 9a = 25/81
5         𝐹(5) = 𝑃(𝑋 ≤ 5) = 36/81
6         𝐹(6) = 𝑃(𝑋 ≤ 6) = 49/81
7         𝐹(7) = 𝑃(𝑋 ≤ 7) = 64/81
8         𝐹(8) = 𝑃(𝑋 ≤ 8) = a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
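A short exact check of Example-3 (Python, using fractions; a sketch, not part of the original notes):

```python
from fractions import Fraction

a = Fraction(1, 81)
pmf = {x: (2 * x + 1) * a for x in range(9)}       # a, 3a, 5a, ..., 17a

assert sum(pmf.values()) == 1                      # 81a = 1
print(sum(p for x, p in pmf.items() if x < 3))     # P(X < 3)  = 1/9
print(1 - sum(p for x, p in pmf.items() if x < 3)) # P(X >= 3) = 8/9
print(sum(p for x, p in pmf.items() if 0 < x < 5)) # P(0 < X < 5) = 8/27

# Distribution function F(x) = P(X <= x)
F = {x: sum(p for xi, p in pmf.items() if xi <= x) for x in pmf}
print(F)   # 1/81, 4/81, 9/81, ..., 1
```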
Example-4

Example-5

Solution:

Example-6

Solution:
Example-7

Let 𝑋 be a continuous random variable denoting the time (in minutes) a person waits for an elevator to arrive. The pdf of 𝑋 is a density 𝑓(𝑥) supported on 0 ≤ 𝑥 ≤ 2 (the formula is not reproduced here). Find the mean and variance of 𝑋.

Solution:

Mean = 𝐸(𝑋) = ∫_{0}^{2} 𝑥 𝑓(𝑥) 𝑑𝑥 = 1

Thus, we expect a person to wait 1 minute for the elevator on average.

𝑉(𝑋) = 𝐸(𝑋²) − [𝐸(𝑋)]² = 7/6 − 1² = 1/6
Example-8

a) Find the constant 𝑐 such that the function

𝑓(𝑥) = 𝑐𝑥² for 0 < 𝑥 < 3, and 𝑓(𝑥) = 0 otherwise,

is a density function, and b) find 𝑃(1 < 𝑋 < 2).

Solution: a) Since 𝑓(𝑥) is a pdf, it satisfies ∫_{−∞}^{∞} 𝑓(𝑥) 𝑑𝑥 = 1

∫_{0}^{3} 𝑐𝑥² 𝑑𝑥 = 1

𝑐 [𝑥³/3]₀³ = 1

9𝑐 = 1

∴ 𝑐 = 1/9

b) 𝑃(1 < 𝑋 < 2) = ∫_{1}^{2} (1/9) 𝑥² 𝑑𝑥 = 7/27
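A quick symbolic check of Example-8 (Python with sympy, assumed to be available):

```python
import sympy as sp

x, c = sp.symbols('x c', positive=True)

# Solve  integral of c*x^2 over (0, 3) = 1  for c
c_val = sp.solve(sp.integrate(c * x**2, (x, 0, 3)) - 1, c)[0]
print(c_val)                                  # 1/9

# P(1 < X < 2) with the fitted constant
print(sp.integrate(c_val * x**2, (x, 1, 2)))  # 7/27
```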

Example-9

A random variable 𝑋 has the following probability distribution:

𝑋 = 𝑥        −2    −1     0     1     2     3
𝑃(𝑋 = 𝑥)    0.1     k   0.2    2k   0.3    3k

(a) Find k, (b) evaluate 𝑃(𝑋 < 2) and 𝑃(−2 < 𝑋 < 2), (c) find the cdf (cumulative distribution function), and (d) evaluate the mean of 𝑋.

Solution: (a) We have ∑_{x} 𝑝(𝑥) = 1

0.1 + k + 0.2 + 2k + 0.3 + 3k = 1

0.6 + 6k = 1

6k = 0.4

k = 0.4/6 = 4/60 = 1/15

∴ the probability distribution becomes

𝑋 = 𝑥        −2     −1     0      1      2     3
𝑃(𝑋 = 𝑥)    1/10   1/15   1/5   2/15   3/10   1/5

(b) 𝑃(𝑋 < 2) = 𝑃(𝑋 = −2) + 𝑃(𝑋 = −1) + 𝑃(𝑋 = 0) + 𝑃(𝑋 = 1)
            = 1/10 + 1/15 + 1/5 + 2/15 = 1/2

𝑃(−2 < 𝑋 < 2) = 𝑃(𝑋 = −1) + 𝑃(𝑋 = 0) + 𝑃(𝑋 = 1) = 1/15 + 1/5 + 2/15 = 2/5

(c) The cdf 𝐹(𝑥) = 𝑃(𝑋 ≤ 𝑥) of the RV 𝑋 is obtained by cumulating these probabilities:

𝑋 = 𝑥     −2     −1      0      1      2     3
𝐹(𝑥)     1/10   1/6    11/30   1/2    4/5    1

(d) Mean of 𝑋 = 𝐸(𝑋) = ∑_{x} 𝑥 · 𝑃(𝑥)
             = (−2)(1/10) + (−1)(1/15) + 0(1/5) + 1(2/15) + 2(3/10) + 3(1/5) = 16/15
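The computations of Example-9 can be verified exactly (Python; the table is the one reconstructed above):

```python
from fractions import Fraction as Fr

k = Fr(1, 15)
pmf = {-2: Fr(1, 10), -1: k, 0: Fr(1, 5), 1: 2 * k, 2: Fr(3, 10), 3: 3 * k}

assert sum(pmf.values()) == 1
print(sum(p for x, p in pmf.items() if x < 2))        # P(X < 2)      = 1/2
print(sum(p for x, p in pmf.items() if -2 < x < 2))   # P(-2 < X < 2) = 2/5
print({x: sum(p for xi, p in pmf.items() if xi <= x) for x in sorted(pmf)})  # cdf
print(sum(x * p for x, p in pmf.items()))             # mean = 16/15
```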
Example-10

A random variable 𝑋 takes the values 0, 1, 2, …, 7 with probabilities 0, k, 2k, 2k, 3k, k², 2k² and 7k² + k respectively. Find the value of k.

Solution:

We have ∑ 𝑝(𝑥) = 1

∴ 0 + k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1

i.e. 10k² + 9k = 1

10k² + 9k − 1 = 0

(10k − 1)(k + 1) = 0

k = 1/10 or k = −1

Since a probability cannot be negative, k = −1 is rejected.

∴ k = 1/10
Example-11

Show that 𝑓(𝑥) = 𝑥 𝑒^{−𝑥²/2}, 𝑥 ≥ 0, is a probability density function.

Solution:

If 𝑓(𝑥) is to be a pdf, then (i) 𝑓(𝑥) ≥ 0 and (ii) ∫_{−∞}^{∞} 𝑓(𝑥) 𝑑𝑥 = 1.

Clearly, for 𝑥 ≥ 0, 𝑥 𝑒^{−𝑥²/2} ≥ 0, and

∫_{0}^{∞} 𝑓(𝑥) 𝑑𝑥 = ∫_{0}^{∞} 𝑥 𝑒^{−𝑥²/2} 𝑑𝑥
              = ∫_{0}^{∞} 𝑒^{−𝑡} 𝑑𝑡      (put 𝑥²/2 = 𝑡, so 𝑥 𝑑𝑥 = 𝑑𝑡)
              = −(𝑒^{−∞} − 𝑒^{0}) = 1

∴ ∫_{0}^{∞} 𝑓(𝑥) 𝑑𝑥 = 1, and hence 𝑓(𝑥) is the pdf of a random variable.
Example-12

A continuous RV 𝑋 has the pdf 𝑓(𝑥) = 𝑘𝑥² 𝑒^{−𝑥}, 𝑥 ≥ 0. Find 𝑘, the mean and the variance.

Solution:

Recall the gamma function: Γ(𝑛) = ∫_{0}^{∞} 𝑥^{𝑛−1} 𝑒^{−𝑥} 𝑑𝑥, with Γ(𝑛 + 1) = 𝑛!

By the property of a pdf,

∫_{0}^{∞} 𝑓(𝑥) 𝑑𝑥 = 1

∫_{0}^{∞} 𝑘𝑥² 𝑒^{−𝑥} 𝑑𝑥 = 1

𝑘 Γ(3) = 1

𝑘(2) = 1

∴ 𝑘 = 1/2
Mean of 𝑋:

Mean = 𝐸(𝑋) = ∫_{0}^{∞} 𝑥 𝑓(𝑥) 𝑑𝑥
     = ∫_{0}^{∞} 𝑥 · (1/2) 𝑥² 𝑒^{−𝑥} 𝑑𝑥
     = (1/2) ∫_{0}^{∞} 𝑥³ 𝑒^{−𝑥} 𝑑𝑥 = (1/2) Γ(4) = 3!/2 = 3

𝐸(𝑋²) = ∫_{0}^{∞} 𝑥² 𝑓(𝑥) 𝑑𝑥
      = ∫_{0}^{∞} 𝑥² · (1/2) 𝑥² 𝑒^{−𝑥} 𝑑𝑥 = (1/2) ∫_{0}^{∞} 𝑥⁴ 𝑒^{−𝑥} 𝑑𝑥 = (1/2) Γ(5) = 4!/2 = 12

𝑉(𝑋) = 𝐸(𝑋²) − [𝐸(𝑋)]² = 12 − 3² = 3
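A numerical sanity check of Example-12 (Python; the improper integrals are truncated at x = 60, which is ample given the e^(−x) decay):

```python
import math

def f(x):
    return 0.5 * x**2 * math.exp(-x)    # pdf with k = 1/2

def integrate(g, lo, hi, n=200_000):
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 60)                       # ~1.0
mean  = integrate(lambda x: x * f(x), 0, 60)      # ~3.0
ex2   = integrate(lambda x: x**2 * f(x), 0, 60)   # ~12.0
print(total, mean, ex2 - mean**2)                 # variance ~3.0
```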
Example-13

A continuous RV 𝑋 has the pdf 𝑓(𝑥) = 3𝑥², 0 ≤ 𝑥 ≤ 1. Find 𝑎 such that 𝑃(𝑋 ≤ 𝑎) = 𝑃(𝑋 > 𝑎).

Solution:

The given pdf of the RV 𝑋 is 𝑓(𝑥) = 3𝑥², 0 ≤ 𝑥 ≤ 1.

(i) 𝑃(𝑋 ≤ 𝑎) = 𝑃(𝑋 > 𝑎)

∴ ∫_{0}^{a} 𝑓(𝑥) 𝑑𝑥 = ∫_{a}^{1} 𝑓(𝑥) 𝑑𝑥

∫_{0}^{a} 3𝑥² 𝑑𝑥 = ∫_{a}^{1} 3𝑥² 𝑑𝑥

[𝑥³]₀ᵃ = [𝑥³]ₐ¹

𝑎³ = 1 − 𝑎³

2𝑎³ = 1

𝑎³ = 1/2

∴ 𝑎 = (1/2)^{1/3} ≈ 0.7937
Example-14

The distribution function of a RV 𝑋 is given by

𝐹(𝑥) = 1 − (1 + 𝑥)𝑒^{−𝑥}, 𝑥 ≥ 0

Find the pdf 𝑓(𝑥), the mean and the variance of 𝑋.

Solution:

By the property of the cumulative distribution function 𝐹(𝑥) of a random variable 𝑋, we have 𝐹′(𝑥) = 𝑓(𝑥), i.e. (𝑑/𝑑𝑥) 𝐹(𝑥) = 𝑓(𝑥).

𝑓(𝑥) = (𝑑/𝑑𝑥) {1 − (1 + 𝑥)𝑒^{−𝑥}}
     = (1 + 𝑥)𝑒^{−𝑥} − 𝑒^{−𝑥}

∴ 𝑓(𝑥) = 𝑥𝑒^{−𝑥}

𝐸(𝑋) = Mean = ∫_{0}^{∞} 𝑥 𝑓(𝑥) 𝑑𝑥 = ∫_{0}^{∞} 𝑥 · 𝑥𝑒^{−𝑥} 𝑑𝑥 = ∫_{0}^{∞} 𝑥² 𝑒^{−𝑥} 𝑑𝑥 = Γ(3) = 2! = 2

(using Γ(𝑛) = ∫_{0}^{∞} 𝑥^{𝑛−1} 𝑒^{−𝑥} 𝑑𝑥 and Γ(𝑛 + 1) = 𝑛!)

𝐸(𝑋²) = ∫_{0}^{∞} 𝑥² 𝑓(𝑥) 𝑑𝑥 = ∫_{0}^{∞} 𝑥² · 𝑥𝑒^{−𝑥} 𝑑𝑥 = ∫_{0}^{∞} 𝑥³ 𝑒^{−𝑥} 𝑑𝑥 = Γ(4) = 3! = 6

𝑉(𝑋) = 𝐸(𝑋²) − [𝐸(𝑋)]² = 6 − 4 = 2
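A symbolic check of Example-14 (Python with sympy, assumed to be available):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
F = 1 - (1 + x) * sp.exp(-x)          # given distribution function

f = sp.simplify(sp.diff(F, x))        # pdf: x*exp(-x)
mean = sp.integrate(x * f, (x, 0, sp.oo))       # 2
ex2  = sp.integrate(x**2 * f, (x, 0, sp.oo))    # 6
print(f, mean, ex2 - mean**2)                   # variance 2
```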
Example-15

(Problem statement not reproduced; 𝑋 is a discrete RV whose distribution gives 𝑃(𝑋 = 2) = 0.3, 𝑃(𝑋 = 3) = 0.2 and 𝑃(𝑋 = 4) = 0.1.)

Solution: We have 𝑃(𝐴/𝐵) = 𝑃(𝐴 ∩ 𝐵)/𝑃(𝐵)

∴ 𝑃((1/2 < 𝑋 < 7/2) / 𝑋 > 1) = 𝑃{(0.5 < 𝑋 < 3.5) ∩ (𝑋 > 1)} / 𝑃(𝑋 > 1)

= [𝑃(𝑋 = 2) + 𝑃(𝑋 = 3)] / [𝑃(𝑋 = 2) + 𝑃(𝑋 = 3) + 𝑃(𝑋 = 4)]

= (0.3 + 0.2)/(0.3 + 0.2 + 0.1) = 0.5/0.6 = 5/6
