
Complex Analysis, Probability and Statistical Methods

(Subject code: BCS301)

Module 2: Joint probability distribution and Markov chain

__________________________________________________________________
Syllabus:

2.1 Joint Probability distribution


Introduction:

❖ Let X = {x_1, x_2, …, x_m} and Y = {y_1, y_2, …, y_n} be two discrete random variables. Then
P(x_i, y_j) = J_ij is called the joint probability function of X and Y if it satisfies the conditions:

(i) J_ij ≥ 0   (ii) ∑_{i=1}^{m} ∑_{j=1}^{n} J_ij = 1
❖ Set of values of this joint probability function 𝐽𝑖𝑗 is called joint probability distribution of
X and Y.
X\Y 𝑦1 𝑦2 … 𝑦𝑛 𝑆𝑢𝑚
𝑥1 𝐽11 𝐽12 … 𝐽1𝑛 𝑓(𝑥1 )
𝑥2 𝐽21 𝐽22 … 𝐽2𝑛 𝑓(𝑥2 )
… … … … … …
𝑥𝑚 𝐽𝑚1 𝐽𝑚2 … 𝐽𝑚𝑛 𝑓(𝑥𝑚 )
𝑆𝑢𝑚 𝑔(𝑦1 ) 𝑔(𝑦2 ) … 𝑔(𝑦𝑛 ) 𝑇𝑜𝑡𝑎𝑙 = 1

❖ Marginal probability distribution of X

x_1      x_2      …  x_m
f(x_1)   f(x_2)   …  f(x_m)

Where f(x_1) + f(x_2) + ⋯ + f(x_m) = 1

Dr. Narasimhan G, RNSIT 1


❖ Marginal probability distribution of Y
𝑦1 𝑦2 … 𝑦𝑛
𝑔(𝑦1 ) 𝑔(𝑦2 ) … 𝑔(𝑦𝑛 )

Where 𝑔(𝑦1 ) + 𝑔(𝑦2 ) + ⋯ + 𝑔(𝑦𝑛 ) = 1


❖ The discrete random variables X and Y are said to be independent random variables if
𝑓(𝑥𝑖 )𝑔(𝑦𝑗 ) = 𝐽𝑖𝑗 .

Important results:
❖ Expectations:

E(x) = ∑_{i=1}^{m} x_i f(x_i),   E(y) = ∑_{j=1}^{n} y_j g(y_j),   E(xy) = ∑_{i=1}^{m} ∑_{j=1}^{n} x_i y_j J_ij

❖ Covariance:
𝐶𝑜𝑣(𝑥, 𝑦) = 𝐸(𝑥𝑦) − 𝐸(𝑥)𝐸(𝑦)

❖ Variance:
𝑉𝑎𝑟(𝑥) = 𝐸(𝑥 2 ) − [𝐸(𝑥)]2 𝑉𝑎𝑟(𝑦) = 𝐸(𝑦 2 ) − [𝐸(𝑦)]2

❖ Standard deviation:
𝜎𝑥 = √𝑉𝑎𝑟(𝑥) 𝜎𝑦 = √𝑉𝑎𝑟(𝑦)

❖ Correlation of X and Y:

ρ(x, y) = Cov(x, y) / (σ_x σ_y)

❖ If X and Y are independent then 𝐸(𝑥𝑦) = 𝐸(𝑥)𝐸(𝑦).
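The results above translate directly into code. Here is a minimal sketch in Python (NumPy and the function name `joint_stats` are my own choices, not part of the notes):

```python
import numpy as np

def joint_stats(x, y, J):
    """Compute marginals, expectations, covariance, standard deviations
    and correlation from a joint table J[i][j] = P(X = x[i], Y = y[j])."""
    x, y, J = np.asarray(x), np.asarray(y), np.asarray(J, dtype=float)
    # Conditions (i) J_ij >= 0 and (ii) sum of all J_ij = 1
    assert np.all(J >= 0) and np.isclose(J.sum(), 1.0)
    f = J.sum(axis=1)                  # marginal distribution of X (row sums)
    g = J.sum(axis=0)                  # marginal distribution of Y (column sums)
    Ex, Ey = x @ f, y @ g              # E(x), E(y)
    Exy = (np.outer(x, y) * J).sum()   # E(xy) = sum of x_i * y_j * J_ij
    cov = Exy - Ex * Ey                # Cov(x, y)
    sx = np.sqrt(x**2 @ f - Ex**2)     # sigma_x
    sy = np.sqrt(y**2 @ g - Ey**2)     # sigma_y
    return f, g, Ex, Ey, cov, cov / (sx * sy)
```

Feeding it the table of problem 1 below returns E(x) = 3, E(y) = 1, Cov(x, y) = −1.5 and ρ ≈ −0.1732.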



1. The joint distribution of random variables X and Y is

X\Y  -4   2    7
1    1/8  1/4  1/8
5    1/4  1/8  1/8

Find (i) Marginal distribution of X and Y. (ii) E(x), E(y) (iii) Are X and Y independent random variables? (iv) Cov(x, y) (v) σ_x, σ_y (vi) ρ(x, y)

𝑥\𝑦 −4 2 7 𝑓(𝑥)
1 1/8 1/4 1/8 1/2
5 1/4 1/8 1/8 1/2
𝑔(𝑦) 3/8 3/8 1/4 𝑇𝑜𝑡𝑎𝑙 = 1

(i) Marginal probability distribution of X:

x     1    5
f(x)  1/2  1/2

Marginal probability distribution of Y:

y     -4   2    7
g(y)  3/8  3/8  1/4

(ii) E(x) = Σ x f(x) = 1(1/2) + 5(1/2) = 3
E(y) = Σ y g(y) = −4(3/8) + 2(3/8) + 7(1/4) = 1

(iii) E(xy) = ΣΣ x_i y_j J_ij
= 1(−4)(1/8) + 1(2)(1/4) + 1(7)(1/8) + 5(−4)(1/4) + 5(2)(1/8) + 5(7)(1/8)
= −4/8 + 4/8 + 7/8 − 5 + 10/8 + 35/8 = 3/2

E(x)E(y) = 3(1) = 3. Therefore, E(xy) ≠ E(x)E(y).


Therefore, 𝑥 𝑎𝑛𝑑 𝑦 are not independent variables.
(iv) Cov(x, y) = E(xy) − E(x)E(y) = 3/2 − 3(1) = −3/2.
(v) E(x²) = Σ x² f(x) = 1²(1/2) + 5²(1/2) = 13
E(y²) = Σ y² g(y) = (−4)²(3/8) + 2²(3/8) + 7²(1/4) = 79/4
σ_x = √(E(x²) − [E(x)]²) = √(13 − 3²) = 2
σ_y = √(E(y²) − [E(y)]²) = √(79/4 − 1²) = 4.3301
(vi) ρ(x, y) = Cov(x, y) / (σ_x σ_y) = −1.5 / 8.6603 = −0.1732
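The arithmetic of problem 1 can be cross-checked in a few lines of Python (a sketch; NumPy and the variable names are my own, not part of the notes):

```python
import numpy as np

# Joint table of problem 1: rows x = 1, 5; columns y = -4, 2, 7
x, y = np.array([1, 5]), np.array([-4, 2, 7])
J = np.array([[1/8, 1/4, 1/8],
              [1/4, 1/8, 1/8]])

f, g = J.sum(axis=1), J.sum(axis=0)   # marginals: [1/2, 1/2] and [3/8, 3/8, 1/4]
Ex, Ey = x @ f, y @ g                 # 3 and 1
Exy = (np.outer(x, y) * J).sum()      # 3/2
cov = Exy - Ex * Ey                   # -3/2
sx = np.sqrt(x**2 @ f - Ex**2)        # 2
sy = np.sqrt(y**2 @ g - Ey**2)        # 4.3301...
rho = cov / (sx * sy)                 # -0.1732...
```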



2. The joint distribution of X and Y is as follows:

X\Y  -3   2    4
1    0.1  0.2  0.2
3    0.3  0.1  0.1

Find (i) Marginal distribution of X and Y. (ii) E(x), E(y) (iii) Are X and Y independent random variables? (iv) Cov(x, y) (v) σ_x, σ_y (vi) ρ(x, y)
𝑥\𝑦 −3 2 4 𝑓(𝑥)
1 0.1 0.2 0.2 0.5
3 0.3 0.1 0.1 0.5
𝑔(𝑦) 0.4 0.3 0.3 𝑇𝑜𝑡𝑎𝑙 = 1
(i) Marginal probability distribution of X:

x     1    3
f(x)  0.5  0.5

Marginal probability distribution of Y:

y     -3   2    4
g(y)  0.4  0.3  0.3

(ii) 𝐸(𝑥) = Σ𝑥𝑓(𝑥) = 1(0.5) + 3(0.5) = 2

𝐸(𝑦) = Σ𝑦𝑔(𝑦) = −3(0.4) + 2(0.3) + 4(0.3) = 0.6

(iii) 𝐸(𝑥𝑦) = ΣΣ𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗 = 1(−3)(0.1) + 1(2)(0.2) + 1(4)(0.2)


+3(−3)(0.3) + 3(2)(0.1) + 3(4)(0.1)
= −0.3 + 0.4 + 0.8 − 2.7 + 0.6 + 1.2 = 0
𝐸(𝑥𝑦) = 0 , 𝐸(𝑥)𝐸(𝑦) = 2(0.6) = 1.2
Therefore, 𝐸(𝑥𝑦) ≠ 𝐸(𝑥)𝐸(𝑦).
Therefore, 𝑥 𝑎𝑛𝑑 𝑦 are not independent variables.
(iv) 𝐶𝑜𝑣 (𝑥, 𝑦) = 𝐸(𝑥𝑦) − 𝐸(𝑥)𝐸(𝑦) = 0 − 1.2 = −1.2
(v) 𝐸(𝑥 2 ) = Σ𝑥 2 𝑓(𝑥) = 12 (0.5) + 32 (0.5) = 5
𝐸(𝑦 2 ) = Σ𝑦 2 𝑔(𝑦) = (−3)2 (0.4) + 22 (0.3) + 42 (0.3) = 9.6
𝜎𝑥 = √𝐸(𝑥 2 ) − [𝐸(𝑥)]2 = √5 − 22 = 1

𝜎𝑦 = √𝐸(𝑦 2 ) − [𝐸(𝑦)]2 = √9.6 − 0.62 = 3.0397


(vi) ρ(x, y) = Cov(x, y) / (σ_x σ_y) = −1.2 / 3.0397 = −0.3948



3. Find the joint distribution of X and Y which are the independent random variables
with the following respective distributions.

x       1    2          y       -2   5    8
f(x_i)  0.7  0.3        g(y_j)  0.3  0.5  0.2

Since X and Y are independent random variables, 𝐽𝑖𝑗 = 𝑓(𝑥𝑖 )𝑔(𝑦𝑗 )


Therefore,
𝑥\𝑦 −2 5 8 𝑓(𝑥)
1 0.21 0.35 0.14 0.7
2 0.09 0.15 0.06 0.3
𝑔(𝑦) 0.3 0.5 0.2 𝑇𝑜𝑡𝑎𝑙 = 1
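The construction in problem 3 is just an outer product of the two marginals. A short check (NumPy is an assumption, not part of the notes):

```python
import numpy as np

f = np.array([0.7, 0.3])        # f(x) for x = 1, 2
g = np.array([0.3, 0.5, 0.2])   # g(y) for y = -2, 5, 8

# Independence: J_ij = f(x_i) * g(y_j) for every cell
J = np.outer(f, g)
```

The rows come out as 0.21, 0.35, 0.14 and 0.09, 0.15, 0.06, matching the table above, and they necessarily sum to 1.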

4. Consider the joint distribution of X and Y:

X\Y  0    1    2    3
0    0    1/8  1/4  1/8
1    1/8  1/4  1/8  0

Compute the following probabilities:
(i) P(X = 1, Y = 2) (ii) P(X ≥ 1, Y ≥ 2) (iii) P(X ≤ 1, Y ≤ 2) (iv) P(X + Y ≥ 2) (v) P(X ≥ 1, Y ≤ 2).

(i) X = {0, 1}, Y = {0, 1, 2, 3}

P(X = 1, Y = 2) = P(1, 2) = 1/8

(ii) If X ≥ 1, X = {1}. If Y ≥ 2, Y = {2, 3}

P(X ≥ 1, Y ≥ 2) = P(1, 2) + P(1, 3) = 1/8 + 0 = 1/8

(iii) If X ≤ 1, X = {0, 1}. If Y ≤ 2, Y = {0, 1, 2}

P(X ≤ 1, Y ≤ 2) = P(0, 0) + P(0, 1) + P(0, 2) + P(1, 0) + P(1, 1) + P(1, 2)
= 0 + 1/8 + 1/4 + 1/8 + 1/4 + 1/8 = 7/8

(iv) If X + Y ≥ 2 then X + Y = 0 + 2 or 0 + 3 or 1 + 1 or 1 + 2 or 1 + 3

P(X + Y ≥ 2) = P(0, 2) + P(0, 3) + P(1, 1) + P(1, 2) + P(1, 3)
= 1/4 + 1/8 + 1/4 + 1/8 + 0 = 3/4

(v) If X ≥ 1, X = {1}. If Y ≤ 2, Y = {0, 1, 2}

P(X ≥ 1, Y ≤ 2) = P(1, 0) + P(1, 1) + P(1, 2) = 1/8 + 1/4 + 1/8 = 1/2
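Queries like the five above all reduce to summing J over the cells whose (x, y) pair satisfies the event. A sketch (the helper `prob` is my own, not from the notes):

```python
import numpy as np

xs, ys = [0, 1], [0, 1, 2, 3]
J = np.array([[0,   1/8, 1/4, 1/8],
              [1/8, 1/4, 1/8, 0  ]])

def prob(event):
    """Sum J over all (x, y) pairs for which event(x, y) is true."""
    return sum(J[i, j] for i, x in enumerate(xs)
                       for j, y in enumerate(ys) if event(x, y))

p1 = prob(lambda x, y: x == 1 and y == 2)   # 1/8
p4 = prob(lambda x, y: x + y >= 2)          # 3/4
```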



5. A fair coin is tossed thrice. The random variables X and Y are defined as follows:
X = 0 or 1 according as a tail or a head occurs in the first toss. Y = Number of heads.
Determine (i) The distribution of X and Y. (ii) The joint distribution of X and Y. (iii) The expectations of X and Y (iv) Standard deviation of X and Y (v) Covariance of X and Y (vi) Correlation of X and Y.

𝑆 = {𝐻𝐻𝐻, 𝐻𝐻𝑇, 𝐻𝑇𝐻, 𝐻𝑇𝑇, 𝑇𝐻𝐻, 𝑇𝐻𝑇, 𝑇𝑇𝐻, 𝑇𝑇𝑇}


𝑋 = {0, 1} 𝑎𝑛𝑑 𝑌 = {0, 1, 2, 3}
(i) Marginal distribution of X:

x     0    1
f(x)  1/2  1/2

Marginal distribution of Y:

y     0    1    2    3
g(y)  1/8  3/8  3/8  1/8

(ii) 𝐽00 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 𝑛𝑜 ℎ𝑒𝑎𝑑𝑠) = 1/8


𝐽01 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 1 ℎ𝑒𝑎𝑑) = 2/8
𝐽02 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 2 ℎ𝑒𝑎𝑑𝑠) = 1/8
𝐽03 = 𝑃(𝐹𝑖𝑟𝑠𝑡 𝑡𝑎𝑖𝑙, 3 ℎ𝑒𝑎𝑑𝑠) = 0
𝐽10 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 𝑛𝑜 ℎ𝑒𝑎𝑑𝑠) = 0
𝐽11 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 1 ℎ𝑒𝑎𝑑) = 1/8
𝐽12 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 2 ℎ𝑒𝑎𝑑𝑠) = 2/8
𝐽13 = 𝑃(𝐹𝑖𝑟𝑠𝑡 ℎ𝑒𝑎𝑑, 3 ℎ𝑒𝑎𝑑𝑠) = 1/8
The joint distribution of X and Y:

X\Y  0    1    2    3
0    1/8  2/8  1/8  0
1    0    1/8  2/8  1/8

(iii) 𝐸(𝑥) = Σ𝑥𝑓(𝑥) = 0(1/2) + 1(1/2) = 1/2


𝐸(𝑦) = Σ𝑦𝑔(𝑦) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 3/2

(iv) 𝐸(𝑥 2 ) = Σ𝑥 2 𝑓(𝑥) = 02 (1/2) + 12 (1/2) = 1/2


𝐸(𝑦 2 ) = Σ𝑦 2 𝑔(𝑦) = 02 (1/8) + 12 (3/8) + 22 (3/8) + 32 (1/8) = 3
𝜎𝑥 = √𝐸(𝑥 2 ) − [𝐸(𝑥)]2 = √1/2 − (1/2)2 = 1/2

𝜎𝑦 = √𝐸(𝑦 2 ) − [𝐸(𝑦)]2 = √3 − (3/2)2 = √3/2


(v) E(XY) = 0 + 0 + 0 + 0 + 0 + 1(1)(1/8) + 1(2)(2/8) + 1(3)(1/8) = 1
Covariance of X and Y: Cov(X, Y) = E(XY) − E(X)E(Y) = 1 − (1/2)(3/2) = 1/4
(vi) Correlation of X and Y: ρ(X, Y) = Cov(X, Y) / (σ_x σ_y) = (1/4) / ((1/2)(√3/2)) = 1/√3 = 0.5774
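The joint table of problem 5 can be rebuilt by brute-force enumeration of the eight equally likely outcomes (a sketch using only the standard library; variable names are mine):

```python
from fractions import Fraction
from itertools import product

# X = 1 if the first toss is a head, else 0; Y = total number of heads.
J = {(x, y): Fraction(0) for x in (0, 1) for y in range(4)}
for toss in product("HT", repeat=3):        # 8 equally likely outcomes
    x = 1 if toss[0] == "H" else 0
    y = toss.count("H")
    J[(x, y)] += Fraction(1, 8)

Exy = sum(x * y * p for (x, y), p in J.items())   # E(XY) = 1
```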



6. The joint probability distribution of two discrete random variables X and Y is given
by 𝒇(𝒙, 𝒚) = 𝒌(𝟐𝒙 + 𝒚) for 𝟎 ≤ 𝒙 ≤ 𝟐 , 𝟎 ≤ 𝒚 ≤ 𝟑. (i) Find the value of 𝒌. (ii) The
marginal distribution of X and Y (iii) Show that X and Y are dependent.

By data, 𝑋 = {0, 1, 2} and 𝑌 = {0, 1, 2, 3}


𝑓(𝑥, 𝑦) = 𝑘(2𝑥 + 𝑦)
The joint probability distribution of X and Y:
X\Y   0    1    2    3    f(x)
0     0    k    2k   3k   6k
1     2k   3k   4k   5k   14k
2     4k   5k   6k   7k   22k
g(y)  6k   9k   12k  15k  42k

(i) Find the value of 𝑘:


1 = Σ f(x, y) = 42k, so k = 1/42

(ii) Marginal probability distribution of X:

x     0     1      2
f(x)  6/42  14/42  22/42

Marginal probability distribution of Y:

y     0     1     2      3
g(y)  6/42  9/42  12/42  15/42

(iii) Consider J_ij = f(x_i, y_j) = f(0, 1) = k

f(x_i) × g(y_j) = f(0) × g(1) = 6k × 9k = 54k²

J_ij ≠ f(x_i) × g(y_j). Therefore, X and Y are dependent.
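The table and the dependence check for f(x, y) = k(2x + y) can be reproduced with exact fractions (a sketch; the variable names are mine):

```python
from fractions import Fraction

xs, ys = range(3), range(4)                       # x = 0,1,2 ; y = 0,1,2,3
raw = {(x, y): 2 * x + y for x in xs for y in ys}
k = Fraction(1, sum(raw.values()))                # total weight 42, so k = 1/42
J = {xy: k * w for xy, w in raw.items()}

fx = {x: sum(J[(x, y)] for y in ys) for x in xs}  # marginal of X: 6k, 14k, 22k
gy = {y: sum(J[(x, y)] for x in xs) for y in ys}  # marginal of Y: 6k, 9k, 12k, 15k
```

Since J[(0, 1)] = k while f(0) · g(1) = 6k · 9k = 54k², the product test fails and X, Y are dependent.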



7. The joint probability distribution of X and Y is given by f(x, y) = c(x² + y²) for x = −1, 0, 1, 3 and y = −1, 2, 3. (i) Find the value of c. (ii) P(x = 0, y ≤ 2) (iii) P(x ≤ 1, y > 2) (iv) P(x ≥ 2 − y).

By data, 𝑋 = {−1, 0, 1, 3} and 𝑌 = {−1, 2, 3}. 𝑓(𝑥, 𝑦) = 𝑐(𝑥 2 + 𝑦 2 )


The joint probability distribution of X and Y:

X\Y   -1   2    3    f(x)
-1    2c   5c   10c  17c
0     c    4c   9c   14c
1     2c   5c   10c  17c
3     10c  13c  18c  41c
g(y)  15c  27c  47c  89c

(i) Find c: 1 = Σ f(x, y) = 89c, so c = 1/89

(ii) 𝑥 = 0, 𝑦 = {−1, 2}

𝑃(𝑥 = 0, 𝑦 ≤ 2) = 𝑃(0, −1) + 𝑃(0, 2) = 𝑐 + 4𝑐 = 5𝑐 = 5/89

(iii) 𝑥 = {−1, 0, 1}, 𝑦 = {3}

𝑃(𝑥 ≤ 1, 𝑦 > 2) = 𝑃(−1, 3) + 𝑃(0, 3) + 𝑃(1, 3)

= 10𝑐 + 9𝑐 + 10𝑐 = 29𝑐 = 29/89

(iv) 𝑃(𝑥 ≥ 2 − 𝑦) = 𝑃(𝑥 + 𝑦 ≥ 2)

= 𝑃(−1, 3) + 𝑃(0, 2) + 𝑃(0, 3) + 𝑃(1, 2) + 𝑃(1, 3)

+𝑃(3, −1) + 𝑃(3, 2) + 𝑃(3, 3)

= 10c + 4c + 9c + 5c + 10c + 10c + 13c + 18c = 79c = 79/89
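Problem 7 follows the same pattern; exact arithmetic with fractions avoids any rounding (a sketch, with my own variable names):

```python
from fractions import Fraction

xs, ys = [-1, 0, 1, 3], [-1, 2, 3]
raw = {(x, y): x**2 + y**2 for x in xs for y in ys}
c = Fraction(1, sum(raw.values()))                    # total weight 89, so c = 1/89
J = {xy: c * w for xy, w in raw.items()}

p_ii = sum(p for (x, y), p in J.items() if x == 0 and y <= 2)   # 5/89
p_iv = sum(p for (x, y), p in J.items() if x + y >= 2)          # 79/89
```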

Homework:

8. Two cards are selected from a box which contains 5 cards numbered 1, 1, 2, 2, 3. Find the joint distribution of X and Y, where X denotes the sum and Y the maximum of the two numbers drawn. Also determine Cov(x, y).

9. The joint distribution of random variables X and Y is

X\Y  1     3     6
1    1/9   1/6   1/18
3    1/6   1/4   1/12
6    1/18  1/12  1/36

Find the marginal distributions of X and Y. Are X and Y independent random variables?
10. X and Y are independent random variables. X takes the values 2, 5, 7 with probabilities 1/2, 1/4, 1/4 respectively. Y takes the values 3, 4, 5 with probabilities 1/3, 1/3, 1/3.
(i) Find the joint probability distribution of X and Y. (ii) Find the covariance of X and Y. (iii) Find the probability distribution of Z = X + Y.

