Solution 1
Solution to Assignment #1
(Winter 2006)
1. Suppose we live at a place where days are either sunny, cloudy, or rainy. The weather
transition function is a Markov chain with the following transition table:

                            tomorrow will be. . .
                            sunny  cloudy  rainy
                   sunny     .8     .2      0
today it's. . .    cloudy    .4     .4      .2
                   rainy     .2     .6      .2
(a) Suppose Day 1 is a sunny day. What is the probability of the following sequence of days:
Day2 = cloudy, Day3 = cloudy, Day4 = rainy?
Answer The probability is 0.2 × 0.4 × 0.2 = 0.016.
Grading ✓ Evaluate an instantiation of a probability distribution.
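The product above can be checked directly against the transition table. The following Python sketch (the dictionary encoding of the table is mine, not part of the original solution) walks the given sequence of days:

```python
# Transition probabilities P(tomorrow | today) from the table above.
T = {"sunny":  {"sunny": 0.8, "cloudy": 0.2, "rainy": 0.0},
     "cloudy": {"sunny": 0.4, "cloudy": 0.4, "rainy": 0.2},
     "rainy":  {"sunny": 0.2, "cloudy": 0.6, "rainy": 0.2}}

days = ["sunny", "cloudy", "cloudy", "rainy"]  # Days 1 through 4
p = 1.0
for today, tomorrow in zip(days, days[1:]):
    p *= T[today][tomorrow]
# p = 0.2 * 0.4 * 0.2 = 0.016
```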
(b) Write a simulator that can randomly generate sequences of “weathers” from this state
transition function.
Answer You may use any language you wish to implement the simulator.
% MATLAB script
% y = sample(x) returns y sampled from distribution x
X_today = [ 0 1 0 ]'; % cloudy
T = [ 8 2 0 ; 4 4 2 ; 2 6 2 ]' / 10;
X_tomorrow = sample( T * X_today );
if ( X_tomorrow(1) == 1 )
    disp( 'sunny' );
elseif ( X_tomorrow(2) == 1 )
    disp( 'cloudy' );
elseif ( X_tomorrow(3) == 1 )
    disp( 'rainy' );
end
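Since `sample` is left as a black box in the MATLAB script, here is a self-contained Python sketch of the same simulator (the state names and function name are my own choices):

```python
import random

STATES = ["sunny", "cloudy", "rainy"]
# P(tomorrow | today): each row corresponds to today's weather.
T = {"sunny":  [0.8, 0.2, 0.0],
     "cloudy": [0.4, 0.4, 0.2],
     "rainy":  [0.2, 0.6, 0.2]}

def simulate(start, n_days, rng=random):
    """Generate a weather sequence of n_days starting from state `start`."""
    seq = [start]
    for _ in range(n_days - 1):
        seq.append(rng.choices(STATES, weights=T[seq[-1]])[0])
    return seq
```

For example, `simulate("cloudy", 5)` plays the role of running the MATLAB snippet four times in a row.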
Grading ✓ Sample from a probability distribution.
(c) Use your simulator to determine the stationary distribution of this Markov chain. The
stationary distribution measures the probability that a random day will be sunny, cloudy,
or rainy.
Answer The stationary distribution can be approximated by the empirical frequency of each weather far into the future.
% MATLAB script
% Burn-in
% (i.e. discard samples that may be affected by the initial condition)
X = [ 0 1 0 ]'; % arbitrary initial state
for ( n = 1 : 10000 )
    X = sample( T * X );
end
% Sampling
X_tally = [ 0 0 0 ]';
for ( n = 1 : 10000 )
    X = sample( T * X );
    X_tally = X_tally + X;
end
X_tally = X_tally / 10000;
The above script produced X_tally = (0.6463, 0.2876, 0.0661)^T.
Grading ✓ Numerically estimate stationary distribution.
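The burn-in-then-tally procedure translates directly to Python; in this sketch the function name, sample counts, and fixed seed are my own choices, not part of the original solution:

```python
import random

STATES = ["sunny", "cloudy", "rainy"]
# P(tomorrow | today), one row per current state.
T = {"sunny":  [0.8, 0.2, 0.0],
     "cloudy": [0.4, 0.4, 0.2],
     "rainy":  [0.2, 0.6, 0.2]}

def estimate_stationary(n_burn=10_000, n_samples=50_000, seed=0):
    rng = random.Random(seed)
    state = "cloudy"
    # Burn-in: forget the initial condition.
    for _ in range(n_burn):
        state = rng.choices(STATES, weights=T[state])[0]
    # Tally the visited states.
    tally = {s: 0 for s in STATES}
    for _ in range(n_samples):
        state = rng.choices(STATES, weights=T[state])[0]
        tally[state] += 1
    return {s: c / n_samples for s, c in tally.items()}
```

The estimate lands close to the exact stationary distribution (9/14, 4/14, 1/14) derived in part (d).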
(d) Can you devise a closed-form solution to calculating the stationary distribution based
on the state transition matrix above?
Answer Let xt be the weather probability distribution on day t and let T be the weather
transition matrix. Then xt+1 = T xt. When we reach a stationary distribution,
xt+1 = xt = x̄, which implies that x̄ = T x̄. In other words, the stationary
distribution x̄ is the eigenvector of T with corresponding eigenvalue 1, normalized
so that its entries sum to 1. In our case, x̄ = (9/14, 4/14, 1/14)^T =
(0.642857, 0.285714, 0.071428)^T.
Grading ✓ Analytically compute stationary distribution.
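The eigenvector claim can be verified without an eigensolver: repeatedly applying T to any starting distribution converges to the eigenvalue-1 eigenvector. A plain-Python sketch (the iteration count is an arbitrary safe choice of mine):

```python
# Column-stochastic transition matrix: T[i][j] = P(tomorrow = i | today = j),
# states ordered (sunny, cloudy, rainy).
T = [[0.8, 0.4, 0.2],
     [0.2, 0.4, 0.6],
     [0.0, 0.2, 0.2]]

x = [1/3, 1/3, 1/3]  # any starting distribution works
for _ in range(200):
    # power iteration: x <- T x converges to the stationary distribution
    x = [sum(T[i][j] * x[j] for j in range(3)) for i in range(3)]
# x ~ (9/14, 4/14, 1/14) ~ (0.642857, 0.285714, 0.071429)
```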
(e) What is the entropy of the stationary distribution?
Answer The entropy of the stationary distribution is −Σ_i x̄_i log2(x̄_i) = 1.198117 bits.
Grading ✓ Definition of entropy.
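The value can be reproduced in two lines of Python, using the exact fractions from part (d):

```python
from math import log2

x_bar = [9/14, 4/14, 1/14]  # stationary distribution from part (d)
H = -sum(p * log2(p) for p in x_bar)  # entropy in bits
# H ~ 1.1981
```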
(f) Using Bayes rule, compute the probability table of yesterday’s weather given today’s
weather. (It is okay to provide the probabilities numerically, and it is also okay to rely
on results from previous questions in this exercise.)
Answer By Bayes rule,

p(xt−1 | xt) = η p(xt | xt−1) p(xt−1)
             = p(xt | xt−1) p(xt−1) / Σ_{x′t−1} p(xt | x′t−1) p(x′t−1)
Using a uniform distribution as the prior, we get

                            yesterday was. . .
                            sunny  cloudy  rainy
                   sunny     4/7    2/7    1/7
today it's. . .    cloudy    1/6    1/3    1/2
                   rainy      0     1/2    1/2
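The reversed table can be generated mechanically; the Python sketch below (function name and encodings are mine) applies Bayes rule with the same uniform prior:

```python
STATES = ["sunny", "cloudy", "rainy"]
# Forward model P(today | yesterday), one row per yesterday's weather.
T = {"sunny":  [0.8, 0.2, 0.0],
     "cloudy": [0.4, 0.4, 0.2],
     "rainy":  [0.2, 0.6, 0.2]}
prior = {s: 1/3 for s in STATES}  # uniform prior over yesterday's weather

def reverse(today):
    """P(yesterday | today) via Bayes rule with the chosen prior."""
    ti = STATES.index(today)
    unnorm = {y: T[y][ti] * prior[y] for y in STATES}
    total = sum(unnorm.values())
    return {y: p / total for y, p in unnorm.items()}
```

For instance, `reverse("sunny")` reproduces the first row of the table: (4/7, 2/7, 1/7).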
2. Suppose that we cannot observe the weather directly, but instead rely on a sensor. The prob-
lem is that our sensor is noisy. Its measurements are governed by the following measurement
model:
                                   our sensor tells us. . .
                                   sunny  cloudy  rainy
                          sunny     .6     .4      0
the actual weather is. . . cloudy    .3     .7      0
                          rainy      0      0      1
(a) Suppose Day 1 is sunny (this is known for a fact), and in the subsequent four days our
sensor observes cloudy, cloudy, rainy, sunny. What is the probability that Day 5 is
indeed sunny as predicted by our sensor?
Answer By applying Bayes rule and the Markov assumption:

P(x5 | x1, z2:5) = η P(z5 | x5, x1, z2:4) P(x5 | x1, z2:4)
                 = η P(z5 | x5) P(x5 | x1, z2:4)

Then

P(x5 | x1, z2:4) = Σ_{x4} P(x4, x5 | x1, z2:4)
                 = Σ_{x4} P(x5 | x4, x1, z2:4) P(x4 | x1, z2:4)
                 = Σ_{x4} P(x5 | x4) P(x4 | x1, z2:4)
Note, however, that z4 = rainy implies x4 = rainy, since rainy is the only weather for
which the sensor reports rainy. Hence

P(x5 | x1, z2:4) = Σ_{x4} P(x5 | x4) P(x4 | x1, z2:4)
                 = P(x5 | x4 = rainy) · 1
Altogether,

P(x5 | x1, z2:5) = η P(z5 = sunny | x5) P(x5 | x4 = rainy)
                 = η (0.6 · 0.2, 0.3 · 0.6, 0 · 0.2)^T
                 = η (0.12, 0.18, 0)^T
                 = (0.4, 0.6, 0)^T

so the probability that Day 5 is indeed sunny, as predicted by our sensor, is 0.4.
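The prediction-then-correction step for Day 5 can be checked numerically; in this Python sketch the encodings of the two tables are my own:

```python
STATES = ["sunny", "cloudy", "rainy"]
# P(tomorrow | today), one row per today's weather.
T = {"sunny":  [0.8, 0.2, 0.0],
     "cloudy": [0.4, 0.4, 0.2],
     "rainy":  [0.2, 0.6, 0.2]}
# P(sensor reading | actual weather), one row per actual weather.
Z = {"sunny":  [0.6, 0.4, 0.0],
     "cloudy": [0.3, 0.7, 0.0],
     "rainy":  [0.0, 0.0, 1.0]}

# z4 = rainy implies x4 = rainy, so the prediction for Day 5 starts from x4 = rainy.
pred = T["rainy"]                 # P(x5 | x4 = rainy) = (0.2, 0.6, 0.2)
like = [Z[s][0] for s in STATES]  # P(z5 = sunny | x5) = (0.6, 0.3, 0.0)
unnorm = [l * p for l, p in zip(like, pred)]
posterior = [u / sum(unnorm) for u in unnorm]
# posterior = (0.4, 0.6, 0.0): Day 5 is sunny with probability 0.4
```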
(b) Once more, suppose Day 1 is known to be sunny, and in the subsequent three days our
sensor observes sunny, sunny, rainy. What is the probability of the weather on each
of Days 2 through 4, with and without the data from future days?
Answer Day 2 with data from future days: 80% sunny, 20% cloudy, 0% rainy
P(x2 | x1, z2:4) = η P(x2 | x1) P(z2:4 | x2, x1)
= η P(x2 | x1) P(z2:4 | x2)
= η P(x2 | x1) P(z2 | z3:4, x2) P(z3:4 | x2)
= η P(x2 | x1) P(z2 | x2) P(z3:4 | x2)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3, z3:4 | x2)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3:4 | x3, x2)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3:4 | x3)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) P(z4 | z3, x3)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) P(z4 | x3)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) Σ_{x4} P(x4, z4 | x3)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) Σ_{x4} P(x4 | x3) P(z4 | x4, x3)
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) Σ_{x4} P(x4 | x3) P(z4 | x4)

Substituting numbers, with vectors indexed by (sunny, cloudy, rainy):

= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) P(z3 | x3) · (0, 0.2, 0.2)^T
      [Σ_{x4} P(x4 | x3) P(z4 = rainy | x4), as a function of x3]
= η P(x2 | x1) P(z2 | x2) Σ_{x3} P(x3 | x2) · (0, 0.06, 0)^T
      [multiplied elementwise by P(z3 = sunny | x3) = (0.6, 0.3, 0)^T]
= η P(x2 | x1) P(z2 | x2) · (0.012, 0.024, 0.036)^T
      [as a function of x2]
= η (0.8, 0.2, 0)^T ∘ (0.6, 0.3, 0)^T ∘ (0.012, 0.024, 0.036)^T
      [elementwise product of P(x2 | x1), P(z2 | x2), and the sum over x3]
= η (0.00576, 0.00144, 0)^T
= (0.8, 0.2, 0)^T
I misinterpreted "future days" as "the following day" when posting the FAQ on the
web. In that case, the probability of the weather on Day 2 given data from only Day 3 is:
92.3% sunny, 7.7% cloudy, 0.0% rainy.
Day 3: 87.2% sunny, 12.8% cloudy, 0% rainy.
Day 3 with data from future days: 0% sunny, 100% cloudy, 0% rainy
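All of the filtered and smoothed numbers above can be reproduced with a small forward-backward pass. The Python sketch below (the structure and names are my own, standing in for the MATLAB used elsewhere) implements the standard Bayes filter plus a backward message:

```python
STATES = ["sunny", "cloudy", "rainy"]
T = {"sunny":  [0.8, 0.2, 0.0],   # P(tomorrow | today)
     "cloudy": [0.4, 0.4, 0.2],
     "rainy":  [0.2, 0.6, 0.2]}
Z = {"sunny":  [0.6, 0.4, 0.0],   # P(sensor reading | actual weather)
     "cloudy": [0.3, 0.7, 0.0],
     "rainy":  [0.0, 0.0, 1.0]}

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

def forward(obs):
    """Filtered beliefs P(x_t | x_1 = sunny, z_2..z_t) for t = 2, 3, ..."""
    belief = [1.0, 0.0, 0.0]  # Day 1 is sunny for a fact
    result = []
    for z in obs:
        zi = STATES.index(z)
        # Prediction step: propagate yesterday's belief through T.
        pred = [sum(T[STATES[i]][j] * belief[i] for i in range(3)) for j in range(3)]
        # Correction step: weight by the measurement likelihood.
        belief = normalize([Z[STATES[j]][zi] * pred[j] for j in range(3)])
        result.append(belief)
    return result

def backward(obs):
    """Backward messages P(z_{t+1}..z_last | x_t) for the same days."""
    beta = [1.0, 1.0, 1.0]
    result = [beta]
    for z in reversed(obs[1:]):  # later measurements influence earlier days
        zi = STATES.index(z)
        beta = [sum(T[STATES[i]][j] * Z[STATES[j]][zi] * beta[j] for j in range(3))
                for i in range(3)]
        result.insert(0, beta)
    return result

obs = ["sunny", "sunny", "rainy"]  # z_2, z_3, z_4
filt = forward(obs)
smooth = [normalize([f * b for f, b in zip(fi, bi)])
          for fi, bi in zip(filt, backward(obs))]
```

Here `filt[1]` gives the 87.2% / 12.8% split for Day 3, and `smooth[0]` and `smooth[1]` give the 80% / 20% and 0% / 100% smoothed results for Days 2 and 3.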
Grading ✓ (bonus) e.g. implementing a Bayes filter, etc.
(c) Consider the same situation (Day 1 is sunny, the measurements for Days 2, 3, and 4 are
sunny, sunny, rainy). What is the most likely sequence of weather for Days 2 through
4? What is the probability of this most likely sequence?
Answer The probability of a sequence of weather x2:4 is given by

P(x2:4 | x1, z2:4) = η P(x2 | x1) P(z2 | x2) P(x3 | x2) P(z3 | x3) P(x4 | x3) P(z4 | x4)

where η normalizes over all candidate sequences, and z4 = rainy forces x4 = rainy.
For the sequence sunny, cloudy, rainy the unnormalized probability is
0.8 · 0.6 · 0.2 · 0.3 · 0.2 · 1 = 0.00576, and for cloudy, cloudy, rainy it is
0.2 · 0.3 · 0.4 · 0.3 · 0.2 · 1 = 0.00144; all other sequences have probability 0.
Hence, the most likely sequence of weather is sunny, cloudy, rainy, which has probability
0.00576 ÷ (0.00576 + 0.00144) = 80%. There is a 20% probability of cloudy, cloudy, rainy
and 0% probability for all other sequences of weather.
Grading ✓ Factorize P(x2:4 | x1, z2:4).
        ✓ Correct weather sequence and probability, reading P(zt | xt) from the measurement model.
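With only 27 candidate sequences, part (c) can also be checked by brute force. A Python sketch under my own encodings of the two tables:

```python
from itertools import product

STATES = ["sunny", "cloudy", "rainy"]
T = {"sunny":  {"sunny": 0.8, "cloudy": 0.2, "rainy": 0.0},
     "cloudy": {"sunny": 0.4, "cloudy": 0.4, "rainy": 0.2},
     "rainy":  {"sunny": 0.2, "cloudy": 0.6, "rainy": 0.2}}
Z = {"sunny":  {"sunny": 0.6, "cloudy": 0.4, "rainy": 0.0},
     "cloudy": {"sunny": 0.3, "cloudy": 0.7, "rainy": 0.0},
     "rainy":  {"sunny": 0.0, "cloudy": 0.0, "rainy": 1.0}}

obs = ["sunny", "sunny", "rainy"]      # z_2, z_3, z_4
scores = {}
for seq in product(STATES, repeat=3):  # candidate weather (x_2, x_3, x_4)
    p, prev = 1.0, "sunny"             # Day 1 is sunny
    for x, z in zip(seq, obs):
        p *= T[prev][x] * Z[x][z]      # transition times measurement likelihood
        prev = x
    scores[seq] = p
total = sum(scores.values())
best = max(scores, key=scores.get)
# best == ("sunny", "cloudy", "rainy"), scores[best] == 0.00576,
# and its normalized probability is 0.8
```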