
Computer Science 226

Solution to Assignment #1
(Winter 2006)

January 19, 2006

1. Suppose we live at a place where days are either sunny, cloudy, or rainy. The weather
transition function is a Markov chain with the following transition table:

                      tomorrow will be...
                      sunny   cloudy   rainy
today it's...  sunny   0.8     0.2     0
               cloudy  0.4     0.4     0.2
               rainy   0.2     0.6     0.2

(a) Suppose Day 1 is a sunny day. What is the probability of the following sequence of days:
Day2 = cloudy, Day3 = cloudy, Day4 = rainy?
Answer The probability is P(cloudy | sunny) × P(cloudy | cloudy) × P(rainy | cloudy) = 0.2 × 0.4 × 0.2 = 0.016.
Grading ✓ Evaluate an instantiation of a probability distribution.
(b) Write a simulator that can randomly generate sequences of “weathers” from this state
transition function.
Answer You may use any language you wish to implement the simulator.
% MATLAB script
X_today = [ 0 1 0 ]'; % indicator vector: today is cloudy
% Transition table, transposed so that column i holds P(tomorrow | today = i)
T = [ 8 2 0 ; 4 4 2 ; 2 6 2 ]' / 10;
X_tomorrow = sample( T * X_today );
if X_tomorrow(1) == 1
    disp( 'sunny' );
elseif X_tomorrow(2) == 1
    disp( 'cloudy' );
elseif X_tomorrow(3) == 1
    disp( 'rainy' );
end

% Helper: y = sample(x) returns an indicator vector y drawn from the
% discrete distribution x
function y = sample( x )
    i = find( rand < cumsum( x ), 1 );
    y = zeros( size( x ) );
    y( i ) = 1;
end
Grading ✓ Sample from a probability distribution.
(c) Use your simulator to determine the stationary distribution of this Markov chain. The
stationary distribution measures the probability that a random day will be sunny, cloudy,
or rainy.

Answer The stationary distribution can be approximated by the empirical frequency
of the weather far into the future.
% MATLAB script (continues from part (b): T and sample are defined there)
X = [ 0 1 0 ]'; % arbitrary initial condition

% Burn-in
% (i.e., discard samples that may be affected by the initial condition)
for n = 1 : 10000
    X = sample( T * X );
end

% Sampling
X_tally = [ 0 0 0 ]';
for n = 1 : 10000
    X = sample( T * X );
    X_tally = X_tally + X;
end
X_tally = X_tally / 10000;
The above script produced X_tally = (0.6463, 0.2876, 0.0661)^T.
Grading ✓ Numerically estimate stationary distribution.
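The same burn-in/sampling scheme can be reproduced outside MATLAB. The following Python/NumPy sketch (a cross-check, not part of the original solution) simulates the chain directly on state indices:

```python
import numpy as np

# P[i, j] = P(tomorrow = j | today = i); state order: 0 sunny, 1 cloudy, 2 rainy
P = np.array([[0.8, 0.2, 0.0],
              [0.4, 0.4, 0.2],
              [0.2, 0.6, 0.2]])

rng = np.random.default_rng(0)
state = 1  # start from a cloudy day; burn-in forgets the initial condition

for _ in range(10_000):            # burn-in
    state = rng.choice(3, p=P[state])

counts = np.zeros(3)
for _ in range(50_000):            # sampling
    state = rng.choice(3, p=P[state])
    counts[state] += 1
freq = counts / counts.sum()
print(freq)  # close to (9/14, 2/7, 1/14) ~ (0.643, 0.286, 0.071)
```

The estimate agrees with the MATLAB run above up to Monte Carlo noise.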
(d) Can you devise a closed-form solution to calculating the stationary distribution based
on the state transition matrix above?
Answer Let xt be the weather probability distribution on day t and let T be the weather
transition matrix. Then xt+1 = T xt. When we reach a stationary distribution,
xt+1 = xt = x̄, which implies that x̄ = T x̄. In other words, the stationary
distribution x̄ is the eigenvector of T with eigenvalue 1, normalized so that its
entries sum to one. In our case, x̄ = (9/14, 2/7, 1/14)^T ≈ (0.642857, 0.285714, 0.071429)^T.
Grading ✓ Analytically compute stationary distribution.
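The eigenvector computation can be checked numerically. A Python/NumPy sketch (a cross-check, not part of the original solution):

```python
import numpy as np

# Column-stochastic transition matrix: T[j, i] = P(tomorrow = j | today = i);
# state order: sunny, cloudy, rainy.
T = np.array([[0.8, 0.4, 0.2],
              [0.2, 0.4, 0.6],
              [0.0, 0.2, 0.2]])

# The stationary distribution is the eigenvector for eigenvalue 1,
# normalized so that its entries sum to one.
vals, vecs = np.linalg.eig(T)
k = np.argmin(np.abs(vals - 1.0))
x_bar = np.real(vecs[:, k])
x_bar = x_bar / x_bar.sum()
print(x_bar)  # [9/14, 2/7, 1/14] ~ [0.642857, 0.285714, 0.071429]
```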
(e) What is the entropy of the stationary distribution?
Answer The entropy of the system is −Σ_i x̄_i log2(x̄_i) = 1.198117 bits.
Grading ✓ Definition of entropy.
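The arithmetic is easy to verify. A one-line Python/NumPy check (not part of the original solution):

```python
import numpy as np

x_bar = np.array([9/14, 2/7, 1/14])   # stationary distribution from part (d)
H = -np.sum(x_bar * np.log2(x_bar))   # entropy in bits
print(H)  # ~ 1.198117
```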
(f) Using Bayes rule, compute the probability table of yesterday’s weather given today’s
weather. (It is okay to provide the probabilities numerically, and it is also okay to rely
on results from previous questions in this exercise.)
Answer By Bayes' rule,

p(x_{t−1} | x_t) = η p(x_t | x_{t−1}) p(x_{t−1})
              = p(x_t | x_{t−1}) p(x_{t−1}) / Σ_{x′_{t−1}} p(x_t | x′_{t−1}) p(x′_{t−1})

Using stationary distribution as prior, we get


                        yesterday was...
                        sunny    cloudy    rainy
today it's...  sunny    0.8      0.17778   0.02222
               cloudy   0.45     0.4       0.15
               rainy    0        0.8       0.2

Using uniform distribution as prior, we get

                        yesterday was...
                        sunny    cloudy    rainy
today it's...  sunny    4/7      2/7       1/7
               cloudy   1/6      1/3       1/2
               rainy    0        1/2       1/2

Grading ✓ Bayes' rule
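Both tables can be generated mechanically from Bayes' rule. A Python/NumPy sketch (a cross-check, not part of the original solution):

```python
import numpy as np

# Forward model F[i, j] = P(today = j | yesterday = i); order sunny, cloudy, rainy
F = np.array([[0.8, 0.2, 0.0],
              [0.4, 0.4, 0.2],
              [0.2, 0.6, 0.2]])

def reverse_table(prior):
    # Bayes' rule: P(yesterday = i | today = j) is proportional to F[i, j] * prior[i]
    joint = F * prior[:, None]            # joint[i, j] = P(yesterday = i, today = j)
    return (joint / joint.sum(axis=0)).T  # row j: P(yesterday | today = j)

B_stat = reverse_table(np.array([9/14, 2/7, 1/14]))  # stationary prior
B_unif = reverse_table(np.full(3, 1/3))              # uniform prior
print(np.round(B_stat, 5))
print(np.round(B_unif, 5))
```

With the stationary prior the rows reproduce (0.8, 0.17778, 0.02222), (0.45, 0.4, 0.15), (0, 0.8, 0.2); with the uniform prior, (4/7, 2/7, 1/7), (1/6, 1/3, 1/2), (0, 1/2, 1/2).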


(g) Suppose we added seasons to our model. The state transition function above would only
apply to the Summer, whereas different ones would apply to Winter, Spring, and Fall.
Would this violate the Markov property of this process? Explain your answer.
Answer The system still satisfies the Markov assumption. It can be viewed as a time-
varying Markov chain: although the transition function changes with time, the value
of xt+1 is still determined by xt alone.
Alternatively, add the season st as an additional state variable taking four values. By
extending the weather transition matrix to 12 × 12, we can still account for all possible
transitions from {xt, st} to {xt+1, st+1}:
p(xt+1, st+1 | x1:t, s1:t) = p(xt+1, st+1 | xt, st)
Grading ✓ Independence and Markov assumption
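The augmented-state construction can be sketched concretely. Note the winter/spring/fall matrices and the rule that the season advances deterministically each day are hypothetical placeholders purely for illustration; only the Summer matrix comes from the assignment:

```python
import numpy as np

T_summer = np.array([[0.8, 0.2, 0.0],
                     [0.4, 0.4, 0.2],
                     [0.2, 0.6, 0.2]])
T_other = np.full((3, 3), 1/3)   # hypothetical winter/spring/fall matrices
T_by_season = [T_summer, T_other, T_other, T_other]

# Hypothetical deterministic season progression s' = (s + 1) mod 4, used only
# to illustrate the 12x12 augmented transition matrix over states (x, s).
A = np.zeros((12, 12))
for s in range(4):
    s_next = (s + 1) % 4
    for x in range(3):
        for x_next in range(3):
            A[3 * s + x, 3 * s_next + x_next] = T_by_season[s][x, x_next]

# Every row is still a probability distribution: the process remains Markov.
print(np.allclose(A.sum(axis=1), 1.0))  # True
```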

2. Suppose that we cannot observe the weather directly, but instead rely on a sensor. The prob-
lem is that our sensor is noisy. Its measurements are governed by the following measurement
model:
                                   our sensor tells us...
                                   sunny   cloudy   rainy
the actual weather is...  sunny     0.6     0.4      0
                          cloudy    0.3     0.7      0
                          rainy     0       0        1

(a) Suppose Day 1 is sunny (this is known for a fact), and in the subsequent four days our
sensor observes cloudy, cloudy, rainy, sunny. What is the probability that Day 5 is
indeed sunny as predicted by our sensor?
Answer By applying Bayes' rule and the Markov assumption:

P(x5 | x1, z2:5) = η P(z5 | x5, x1, z2:4) P(x5 | x1, z2:4)
              = η P(z5 | x5) P(x5 | x1, z2:4)

Then

P(x5 | x1, z2:4) = Σ_{x4} P(x4, x5 | x1, z2:4)
              = Σ_{x4} P(x5 | x4, x1, z2:4) P(x4 | x1, z2:4)
              = Σ_{x4} P(x5 | x4) P(x4 | x1, z2:4)

Note, however, that z4 = rainy implies that x4 = rainy, so

P(x5 | x1, z2:4) = Σ_{x4} P(x5 | x4) P(x4 | x1, z2:4)
              = P(x5 | x4 = rainy) · 1

Altogether,

P(x5 = sunny | x1, z2:3, z4 = rainy, z5 = sunny)
  = η P(z5 = sunny | x5 = sunny) P(x5 = sunny | x4 = rainy)
  = P(z5 = sunny | x5 = sunny) P(x5 = sunny | x4 = rainy) / Σ_{x5′} P(z5 = sunny | x5′) P(x5′ | x4 = rainy)
  = (0.6 · 0.2) / (0.6 · 0.2 + 0.3 · 0.6 + 0)
  = 0.4

Grading ✓ Bayes' rule
✓ Incorporate state transition probability P(xt+1 | xt)
✓ Incorporate measurement probability P(zt | xt)
✓ Correct probability
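Since z4 = rainy pins down x4 = rainy, the whole computation reduces to one prediction step plus one measurement update. A Python sketch (a cross-check, not part of the original solution):

```python
import numpy as np

P_x5 = np.array([0.2, 0.6, 0.2])   # P(x5 | x4 = rainy): transition row for rainy
L_z5 = np.array([0.6, 0.3, 0.0])   # P(z5 = sunny | x5) for each value of x5

post = L_z5 * P_x5                 # unnormalized posterior over x5
post = post / post.sum()           # the normalizer plays the role of eta
print(post[0])  # P(x5 = sunny | x1, z2:5) = 0.4
```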
(b) Once again, suppose Day 1 is known to be sunny. At Days 2 through 4, the sensor
measures sunny, sunny, rainy. For each of the Days 2 through 4, what is the most likely
weather on that day? Answer the question in two ways: one in which only the data
available to the day in question is used, and one in hindsight, where data from future
days is also available.
Answer Day 2: 88.9% sunny, 11.1% cloudy, 0% rainy.

P(x2 | x1, z2) = η P(z2 | x2, x1) P(x2 | x1)
            = η P(z2 | x2) P(x2 | x1)
            = η (0.6, 0.3, 0)^T · (0.8, 0.2, 0)^T   (componentwise)
            = η (0.48, 0.06, 0)^T
            = (8/9, 1/9, 0)^T

Day 2 with data from future days: 80% sunny, 20% cloudy, 0% rainy
P(x2 | x1, z2:4) = η P(x2 | x1) P(z2:4 | x2, x1)
              = η P(x2 | x1) P(z2:4 | x2)
              = η P(x2 | x1) P(z2 | x2) P(z3:4 | x2)
              = η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3:4 | x3)
              = η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) P(z4 | x3)
              = η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) Σ_{x4} P(x4 | x3) P(z4 | x4)

Evaluating from the innermost sum outward (state order sunny, cloudy, rainy):

Σ_{x4} P(x4 | x3) P(z4 = rainy | x4) = (0, 0.2, 0.2)^T            as a function of x3
P(z3 = sunny | x3) · (0, 0.2, 0.2)^T = (0, 0.06, 0)^T             componentwise
Σ_{x3} P(x3 | x2) · (0, 0.06, 0)^T = (0.012, 0.024, 0.036)^T      as a function of x2

Hence

P(x2 | x1, z2:4) = η (0.8, 0.2, 0)^T · (0.6, 0.3, 0)^T · (0.012, 0.024, 0.036)^T   (componentwise)
              = η (0.00576, 0.00144, 0)^T
              = (0.8, 0.2, 0)^T
I misinterpreted "future days" as "the following day" when posting the FAQ on the
web. In that case, the probability of the weather on Day 2 given data from Day 3
only is: 92.3% sunny, 7.7% cloudy, 0.0% rainy.

P(x2 | x1, z2, z3) = η P(x2 | x1) P(z2, z3 | x2, x1)
                = η P(x2 | x1) P(z2, z3 | x2)
                = η P(x2 | x1) P(z2 | x2) P(z3 | x2)
                = η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3)

Evaluating the sum (state order sunny, cloudy, rainy):

Σ_{x3} P(x3 | x2) P(z3 = sunny | x3) = (0.54, 0.36, 0.30)^T   as a function of x2

Hence

P(x2 | x1, z2, z3) = η (0.8, 0.2, 0)^T · (0.6, 0.3, 0)^T · (0.54, 0.36, 0.30)^T   (componentwise)
                = η (0.2592, 0.0216, 0)^T
                = (2592/2808, 216/2808, 0)^T
                = (0.9231, 0.0769, 0)^T

Day 3: 87.2% sunny, 12.8% cloudy, 0% rainy.

P(x3 | x1, z2:3) = η P(z3 | x3, x1, z2) P(x3 | x1, z2)
              = η P(z3 | x3) Σ_{x2} P(x3, x2 | x1, z2)
              = η P(z3 | x3) Σ_{x2} P(x3 | x2) P(x2 | x1, z2)

Substituting P(x2 | x1, z2) = (8/9, 1/9, 0)^T from above:

Σ_{x2} P(x3 | x2) P(x2 | x1, z2) = (1/9) (6.8, 2.0, 0.2)^T

P(x3 | x1, z2:3) = η (0.6, 0.3, 0)^T · (6.8, 2.0, 0.2)^T   (componentwise)
              = (408/468, 60/468, 0)^T = (0.8718, 0.1282, 0)^T

Day 3 with data from future days: 0% sunny, 100% cloudy, 0% rainy.

P(x3 | x1, z2:4) = η P(x3 | x1, z2:3) P(z4 | x3, x1, z2:3)
              = η P(x3 | x1, z2:3) P(z4 | x3)
              = η (0.8718, 0.1282, 0)^T · (0, 0.2, 0.2)^T   (componentwise)
              = (0, 1, 0)^T

Day 4: 0% sunny, 0% cloudy, 100% rainy.

P(x4 | x1, z2:4) = η P(z4 | x4) Σ_{x3} P(x4 | x3) P(x3 | x1, z2:3)
              = (0, 0, 1)^T

since P(z4 = rainy | x4) = (0, 0, 1)^T: the sensor reports rainy only when it is
actually rainy.

Grading ✓ Incorporate initial condition P(xk | x1, · · ·)
✓ Incorporate previous sensor readings P(xk | z1:k, · · ·)
✓ ✓ ✓ Incorporate future sensor readings P(xk | zk+1:4, · · ·)
✓ Correct weather predictions
✓ (bonus) e.g. implementing a Bayes filter, etc.
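The per-day answers above are exactly one forward (filtering) pass and one backward (smoothing) pass of a discrete Bayes filter. A Python/NumPy sketch (a cross-check, not part of the original solution):

```python
import numpy as np

# P[i, j] = P(tomorrow = j | today = i); S[i, j] = P(z = j | x = i);
# state/observation order: sunny, cloudy, rainy.
P = np.array([[0.8, 0.2, 0.0],
              [0.4, 0.4, 0.2],
              [0.2, 0.6, 0.2]])
S = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.7, 0.0],
              [0.0, 0.0, 1.0]])

z = [0, 0, 2]                      # sensor readings, days 2-4: sunny, sunny, rainy
alpha = np.array([1.0, 0.0, 0.0])  # Day 1 is known to be sunny

# Forward pass: filtered estimates P(x_t | x_1, z_{2:t})
filtered = []
for zt in z:
    alpha = S[:, zt] * (P.T @ alpha)   # predict, then weight by the measurement
    alpha = alpha / alpha.sum()
    filtered.append(alpha)

# Backward pass: beta_t(x) = P(z_{t+1:4} | x_t = x)
beta = np.ones(3)
betas = [beta]
for zt in reversed(z[1:]):
    beta = P @ (S[:, zt] * beta)
    betas.insert(0, beta)

# Smoothed estimates P(x_t | x_1, z_{2:4})
smoothed = []
for a, b in zip(filtered, betas):
    s = a * b
    smoothed.append(s / s.sum())

for day, (f, s) in enumerate(zip(filtered, smoothed), start=2):
    print(day, np.round(f, 4), np.round(s, 4))
```

This reproduces the filtered answers (8/9, 1/9, 0), (0.8718, 0.1282, 0), (0, 0, 1) and the smoothed answers (0.8, 0.2, 0), (0, 1, 0), (0, 0, 1) for Days 2 through 4.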
(c) Consider the same situation (Day 1 is sunny, the measurements for Days 2, 3, and 4 are
sunny, sunny, rainy). What is the most likely sequence of weather for Days 2 through
4? What is the probability of this most likely sequence?
Answer The probability of the sequence of weather is given by

P(x2:4 | x1, z2:4) = η P(z2:4 | x1, x2:4) P(x2:4 | x1)

where

P(x2:4 | x1) = P(x4 | x3) P(x3 | x2) P(x2 | x1)

and

P(z2:4 | x1, x2:4) = P(z4 | x4) P(z3 | x3) P(z2 | x2)

Hence, the most likely sequence of weather is sunny, cloudy, rainy, which has a
probability of 0.00576 / (0.00576 + 0.00144) = 80% of occurring. There is a 20%
probability of cloudy, cloudy, rainy and 0% probability for all other sequences of weather.
Grading ✓ Factorize P(x2:4 | x1, z2:4)
✓ Correct weather sequence and probability
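With only 27 candidate sequences, the result is easy to verify by brute force. A Python sketch (a cross-check, not part of the original solution):

```python
import itertools
import numpy as np

P = np.array([[0.8, 0.2, 0.0],
              [0.4, 0.4, 0.2],
              [0.2, 0.6, 0.2]])   # P[i, j] = P(tomorrow = j | today = i)
S = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.7, 0.0],
              [0.0, 0.0, 1.0]])   # S[i, j] = P(z = j | x = i)

z = [0, 0, 2]                     # observed: sunny, sunny, rainy (days 2-4)
names = ['sunny', 'cloudy', 'rainy']

# Score every candidate sequence x_{2:4}, starting from x1 = sunny
scores = {}
for seq in itertools.product(range(3), repeat=3):
    p, prev = 1.0, 0
    for xt, zt in zip(seq, z):
        p *= P[prev, xt] * S[xt, zt]
        prev = xt
    scores[seq] = p

total = sum(scores.values())
best = max(scores, key=scores.get)
print([names[i] for i in best], scores[best] / total)
# ['sunny', 'cloudy', 'rainy'] with probability 0.8
```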
