Discrete-Time Markov Chains: ELEC345
Introduction
• Markov chains model phenomena or systems with states
and transitions between states.
• The states can represent, for example:
– ON-OFF traffic sources
– The number of packets in a buffer.
– Web page ranking (Google's PageRank algorithm is based on a Markov chain).
Example Chain
• Variable bit rate source (e.g. coded video
source):
[Figure: two-state chain with rates 2 Mbps and 6 Mbps, drawn against a time axis.]
In discrete time the chain can only change state at the arrows.
Probability density
• Repeating a random
experiment allows a
histogram to be obtained
• As the number of experiments
increases the histogram
approaches a theoretical curve
(Probability density)
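As a rough illustration (a sketch, not from the notes: the experiment, sample size and plotting calls are assumptions), the MATLAB lines below repeat a simple random experiment many times and overlay the histogram on the theoretical density:

n = 1e5;                               % number of repeated experiments (assumed value)
x = -log(rand(n,1));                   % samples of an Exponential(1) random variable
histogram(x,'Normalization','pdf');    % empirical histogram scaled as a density
hold on
t = linspace(0,8,200);
plot(t,exp(-t),'LineWidth',2)          % theoretical probability density f(t) = exp(-t)
legend('histogram','theoretical density')
hold off

As n grows, the histogram follows the theoretical curve more and more closely.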
Steady state
• Typically, the chain starts (at time 0) in a given state with a certain probability, and at each time step the state changes according to the probabilities in P.
• The time steps are numbered 0,1,2,3, … , n,...
• After a long period of time a steady state condition is reached
for which there is a probability of being in a particular state.
[Figure: sample path of the chain, showing whether it is in State 1 or State 2 at each time step 0, 1, 2, …, 6.]
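As a minimal sketch (the transition matrix is borrowed from the worked example later in these notes; the number of steps is an arbitrary choice), one sample path of a two-state chain can be simulated in MATLAB as:

P = [0.8,0.2;0.4,0.6];         % P(i,j) = Prob(next state = j | current state = i)
nSteps = 20;                   % number of time steps to simulate
state = 1;                     % the chain starts in state 1 at time 0
path = zeros(1,nSteps+1);
path(1) = state;
for k = 1:nSteps
    % choose the next state according to row 'state' of P
    state = find(rand < cumsum(P(state,:)),1);
    path(k+1) = state;
end
disp(path)                     % one random realisation, e.g. 1 1 1 2 2 1 ...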
Two-state example
The chain starts in State 1
[Figure: Prob(in state 1) and Prob(in state 2) plotted against time steps 0, 1, 2, …, 13, …; after the first few steps both probabilities settle to constant steady state values.]
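A plot like this can be reproduced with a short MATLAB sketch (the transition matrix is the one used in the worked example below; 13 steps matches the axis above):

P = [0.8,0.2;0.4,0.6];         % transition matrix from the worked example
p = [1 0];                     % the chain starts in state 1 at time 0
for n = 1:13
    p = p*P;                   % one time step of the state probabilities
    fprintf('n=%2d  Prob(state 1)=%.4f  Prob(state 2)=%.4f\n',n,p(1),p(2));
end
% p approaches the steady state probabilities (2/3, 1/3)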
Steady state probabilities
• The steady state probability of being in state i is called πi (i = 1, 2)
• With N states there are N steady state probabilities:
π1 = Prob (State = 1)
π2 = Prob (State = 2)
…
πN = Prob (State = N)
These can be put into row vector form
π = (π1, π2, … , πN)
Equations for steady state
probabilities - Two-state example
• In steady state the probability of being in state 1 is equal
to:
– The probability of being in state 1 in the previous time slot and
making a transition to itself with probability P11
plus
– the probability of being in state 2 in the previous time slot and
making a transition to state 1 with probability P21.
• This gives an equilibrium equation for state 1:
π1 = π1·P11 + π2·P21
Picture (State 1)
π1 = π1·P11 + π2·P21
In words
π1 = π1·P11 + π2·P21
Prob of being in state 1 = Prob of being in state 1 × Prob of transition from 1 to 1   (A)
                         + Prob of being in state 2 × Prob of transition from 2 to 1   (B)
n.b. the two events:
(A) In state 1 and transition from 1 to 1
(B) In state 2 and transition from 2 to 1
cover every possibility and are mutually exclusive, so the probabilities add.
Picture (State 2)
π2 = π1·P12 + π2·P22
Conditioning
π2·P21 = Prob of being in state 2 × Prob of transition from 2 to 1
       = P(A) · P(B | A)
where
A = Previous state is 2
B = Current state is 1
Balance equations
• The equilibrium equations are also called balance equations.
• Rewritten together they are
π1 = π1·P11 + π2·P21
π2 = π1·P12 + π2·P22
• We often write this in matrix form as
(π1  π2) [P11  P12; P21  P22] = (π1  π2),   i.e.  πP = π
Solving for steady state probabilities
• Our goal is to find π1 and π2.
Direct solution
• Solving the balance equations gives π1 in terms of π2, as the equations are linearly dependent (see next slide).
• We must add the requirement that π1 + π2 = 1 (the normalisation requirement) to find the unique solution for π1 and π2.
Two-state example
• Solving the balance equations gives
π1 = 0.8·π1 + 0.4·π2
π2 = 0.2·π1 + 0.6·π2
i.e.
0.2·π1 = 0.4·π2
0.4·π2 = 0.2·π1
π1 = 2·π2
• Hence the equations are linearly dependent as there is an infinity of solutions for π1, π2. We add the normalisation equation to find a unique solution:
π1 + π2 = 1
Unique solution
• The balance equations give
π1 = 2·π2
• Inserting this into the normalisation equation π1 + π2 = 1 gives
2·π2 + π2 = 1
i.e.
3·π2 = 1
The solution is thus:
π2 = 1/3
π1 = 2/3
Verification
Solution:
π1 = 2/3
π2 = 1/3
Equations:
π1 = 0.8·π1 + 0.4·π2
π2 = 0.2·π1 + 0.6·π2
1 = π1 + π2
Verification:
2/3 = 0.8 × 2/3 + 0.4 × 1/3
1/3 = 0.2 × 2/3 + 0.6 × 1/3
1 = 2/3 + 1/3
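The same verification as a quick MATLAB check (a sketch, not part of the notes; the variable pi_ is used to avoid MATLAB's built-in constant pi):

P   = [0.8,0.2;0.4,0.6];
pi_ = [2/3 1/3];               % hand solution
disp(pi_*P)                    % balance check: should reproduce 0.6667 0.3333
disp(sum(pi_))                 % normalisation check: should equal 1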
Average data rate
• 2 state example
Average data rate
= Prob(in state 1) × Rate in state 1 + Prob(in state 2) × Rate in state 2
= π1 × 2 + π2 × 6
= (2/3) × 2 + (1/3) × 6 = 10/3 Mbps
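The same calculation in MATLAB (a sketch; probabilities and rates taken from the example):

pi_   = [2/3 1/3];             % steady state probabilities
rates = [2 6];                 % data rate in Mbps in state 1 and state 2
avgRate = pi_*rates.'          % = 10/3 Mbps, about 3.33 Mbps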
Matrix method of solution
• We will develop a technique that can be used to solve for
arbitrary chains. We use the example as a guide:
• Start with the balance equations together with the
normalisation equation:
π1 = π1·P11 + π2·P21
π2 = π1·P12 + π2·P22
1 = π1 + π2
• In matrix form (the top two rows are P' - I, the bottom row is all ones):
[0]   [ P11-1    P21  ] [π1]
[0] = [ P12     P22-1 ] [π2]
[1]   [   1       1   ]
• Write the matrix equation in block matrix form using
symbols for known inner matrices:
[0]   [ P' - I ]
[ ] = [        ] π'
[1]   [   1    ]
where
0 = [0; 0]                    column vector of zeros
1 = [1  1]                    row vector of ones (n.b. the 1 on the LHS is just "one")
I = [1 0; 0 1]                identity matrix
P' = [P11 P21; P12 P22]       transpose of the probability transition matrix P
π' = [π1; π2]                 transpose of the steady state probability vector π
• We can write this equation in terms of the matrix Q and
vector b where we have to solve for x
b = Q x
where
b = [0; 1]        (the column of zeros stacked above a single 1, i.e. [0; 0; 1] here)
Q = [P' - I; 1]   (P' - I stacked above the row of ones)
x = π'
Matlab solution
• The Matlab code to find the steady state probabilities for the example
is:
N = 2; % Number of states
P = [0.8,0.2;0.4,0.6]; % Prob. Transition matrix for chain
Q = [P' - eye(N);ones(1,N)]; % eye creates identity matrix
% ones creates matrices of all 1's
b = [zeros(N,1);1]; % zeros creates matrices of all 0's
x = linsolve(Q,b); % linsolve solves linear equations
fprintf('pi1=%g pi2=%g\n',x(1),x(2))
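For the example chain this should print pi1=0.666667 pi2=0.333333, matching the hand solution π1 = 2/3, π2 = 1/3.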
Markov chain structures
• State x communicates with state y if it is possible to get
from x to y in a finite number of transitions and from y to x
in a finite number of transitions.
• A chain is irreducible if every state communicates with
every other state
– It is possible to have disconnected sets of states in a chain.
• A state is called recurrent if it is returned to an infinite
number of times (with probability 1).
• Otherwise the state is called transient – it is only visited a
finite number of times
• Some chains can have absorbing states which, once entered, cannot be left.
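As a rough numerical illustration (a sketch, not from the notes; the matrix P below is an assumed example), whether every state communicates with every other state can be checked from the reachability matrix of the directed graph formed by the non-zero entries of P:

P = [0.8,0.2;0.4,0.6];          % assumed example transition matrix
N = size(P,1);
A = P > 0;                      % adjacency matrix: a direct transition exists
R = (eye(N) + A)^(N-1) > 0;     % R(x,y) true if y is reachable from x in at most N-1 steps
irreducible = all(all(R & R.')) % true only if every pair of states communicates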
Examples
[Figure: three four-state chains over states 1, 2, 3, 4.
First chain: not irreducible (e.g. 1 and 4 do not communicate).
Second chain: not irreducible (e.g. cannot go from 4 to 1).
Third chain: irreducible.]