Probabilistic Reasoning Over Time
Non-official Slides
Session 15
Outline
1. Time and Uncertainty
2. Inference in Temporal Model
3. Hidden Markov Models
4. Dynamic Bayesian Networks
5. Summary
Time and Uncertainty
• How do we estimate the probability of a changing random
variable?
– When a car is broken, it remains broken throughout the
diagnosis process (static)
– A diabetic patient, on the other hand, has changing
evidence (blood sugar, insulin doses, etc.) (dynamic)
• We view the world as a series of snapshots (time slices)
– Xt denotes the set of state variables at time t
– Et denotes the set of observable evidence variables at time t
Time and Uncertainty
• Simple example
Time and Uncertainty
• How do we construct the Bayesian network? What is the transition
model?
– First-order Markov process: P(Xt | X0:t-1) = P(Xt | Xt-1)
– Second-order Markov process: P(Xt | X0:t-1) = P(Xt | Xt-2, Xt-1)
Time and Uncertainty
• The complete joint distribution is the combination of the
transition model and sensor model:
P(X0:t, E1:t) = P(X0) Πi=1..t P(Xi | Xi-1) P(Ei | Xi)
Markov Chain
• A Markov process has 3 states, with the transition matrix
Markov Chain
• Example
– A child with a lower-class parent has a 60% chance of
remaining in the lower class, a 40% chance of rising to the
middle class, and no chance of reaching the upper class. A
child with a middle-class parent has a 30% chance of falling
to the lower class, a 40% chance of remaining middle class,
and a 30% chance of rising to the upper class. Finally, a child
with an upper-class parent has no chance of falling to the
lower class, a 70% chance of falling to the middle class,
and a 30% chance of remaining in the upper class.
– Assume that 20% of the population belongs to the lower
class, 30% to the middle class, and 50% to the upper class.
Markov Chain
• Solution
• Transition matrix (rows: parent's class; columns: child's class; order Lower, Middle, Upper):
       L    M    U
  L  0.6  0.4  0.0
  M  0.3  0.4  0.3
  U  0.0  0.7  0.3
• Transition diagram: three states (Lower, Middle, Upper) connected by arcs carrying the probabilities above
Markov Chain
• Solution: the class distribution after one generation is the initial
distribution multiplied by the transition matrix:
(0.2, 0.3, 0.5) × P = (0.2·0.6 + 0.3·0.3, 0.2·0.4 + 0.3·0.4 + 0.5·0.7, 0.3·0.3 + 0.5·0.3)
                    = (0.21, 0.55, 0.24)
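The arithmetic above can be checked with a few lines of code (a minimal sketch; the matrix and initial distribution are the ones from the example):

```python
# Social-mobility Markov chain from the example.
# Rows/columns are ordered Lower, Middle, Upper.
P = [
    [0.6, 0.4, 0.0],  # lower-class parent
    [0.3, 0.4, 0.3],  # middle-class parent
    [0.0, 0.7, 0.3],  # upper-class parent
]
pi0 = [0.2, 0.3, 0.5]  # initial population distribution

def step(dist, P):
    """One step of the chain: the next distribution is dist x P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

print([round(p, 2) for p in step(pi0, P)])  # [0.21, 0.55, 0.24]
```

Applying `step` repeatedly gives the distribution after any number of generations.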
Inference in Temporal Model
• Inference tasks:
– Filtering: P(Xt | e1:t), the posterior distribution over the current state
– Prediction: P(Xt+k | e1:t) for k > 0, the posterior distribution over a future state
– Smoothing: P(Xk | e1:t) for 0 ≤ k < t, the posterior distribution over a past state
– Most likely explanation: argmax x1:t P(x1:t | e1:t)
Inference in Temporal Model
• Filtering and prediction
– Recursive estimation: P(Xt+1 | e1:t+1) = α P(et+1 | Xt+1) Σxt P(Xt+1 | xt) P(xt | e1:t),
where et+1 is the new evidence and α is a normalizing constant
– Prediction can be seen simply as filtering without the
addition of new evidence
Inference in Temporal Model
• Filtering
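One filtering update can be sketched as follows; the two-state model values in the demo (transition and sensor probabilities) are illustrative assumptions, not taken from the slides:

```python
# One filtering (forward) step:
#   f_{t+1}(x') is proportional to P(e_{t+1} | x') * sum_x P(x' | x) * f_t(x)
# T[i][j] = P(next state = j | current state = i)
# E[j]    = P(observed evidence | state = j)

def forward_step(f, T, E):
    n = len(f)
    unnorm = [E[j] * sum(T[i][j] * f[i] for i in range(n)) for j in range(n)]
    z = sum(unnorm)  # normalizing constant (alpha)
    return [u / z for u in unnorm]

# Hypothetical 2-state model (values assumed for illustration):
T = [[0.7, 0.3], [0.3, 0.7]]  # transition model
E = [0.9, 0.2]                # sensor likelihood of the evidence just seen
f0 = [0.5, 0.5]               # prior over the initial state
f1 = forward_step(f0, T, E)
print([round(p, 3) for p in f1])  # [0.818, 0.182]
```

Calling `forward_step` once per time slice keeps the belief state up to date as evidence arrives.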
Inference in Temporal Model
• Smoothing: P(Xk | e1:t) = α f1:k × bk+1:t, combining a forward
message (filtering up to time k) with a backward message
(evidence from k+1 to t)
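The forward and backward passes can be combined in a short sketch (generic discrete HMM; the 2-state model used in the demo is an illustrative assumption):

```python
# Forward-backward smoothing for a discrete HMM.
# T[i][j]  = P(next = j | current = i)
# Ev[t][j] = likelihood of the t-th observation under state j

def forward_backward(prior, T, Ev):
    n = len(prior)
    # Forward pass: f[t] = P(X_t | e_1:t)
    f = [prior]
    for E in Ev:
        prev = f[-1]
        un = [E[j] * sum(T[i][j] * prev[i] for i in range(n)) for j in range(n)]
        z = sum(un)
        f.append([u / z for u in un])
    # Backward pass: b holds P(e_{k+1:t} | X_k), built right to left
    b = [1.0] * n
    smoothed = []
    for t in range(len(Ev), 0, -1):
        un = [f[t][i] * b[i] for i in range(n)]
        z = sum(un)
        smoothed.insert(0, [u / z for u in un])
        E = Ev[t - 1]
        b = [sum(T[i][j] * E[j] * b[j] for j in range(n)) for i in range(n)]
    return smoothed

# Illustrative 2-state demo: the same evidence likelihood on both steps.
s = forward_backward([0.5, 0.5], [[0.7, 0.3], [0.3, 0.7]], [[0.9, 0.2], [0.9, 0.2]])
print([round(p, 3) for p in s[0]])  # [0.883, 0.117]
```

Note how the smoothed estimate for the first step (0.883) is sharper than the filtered one (0.818), because it also uses the later evidence.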
Inference in Temporal Model
• Most likely explanation: find the state sequence x1:t that
maximizes P(x1:t | e1:t)
• How? With the Viterbi algorithm: a recursive computation that
keeps, for each state, the probability of the most likely path
reaching it, plus a back-pointer to recover the sequence
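The Viterbi computation can be sketched as follows (generic discrete HMM; the model values in the demo are illustrative assumptions):

```python
# Viterbi: most likely state sequence x_1:t given the evidence.
# T[i][j]  = P(next = j | current = i)
# Ev[t][j] = likelihood of the t-th observation under state j

def viterbi(prior, T, Ev):
    n = len(prior)
    # m[j]: probability of the best path ending in state j at the first step
    m = [Ev[0][j] * sum(prior[i] * T[i][j] for i in range(n)) for j in range(n)]
    back = []  # back-pointers: best predecessor of each state at each step
    for E in Ev[1:]:
        prev, m, ptr = m, [], []
        for j in range(n):
            best_i = max(range(n), key=lambda i: prev[i] * T[i][j])
            m.append(prev[best_i] * T[best_i][j] * E[j])
            ptr.append(best_i)
        back.append(ptr)
    # Follow the back-pointers from the best final state.
    j = max(range(n), key=lambda j: m[j])
    path = [j]
    for ptr in reversed(back):
        j = ptr[j]
        path.insert(0, j)
    return path

# Illustrative 2-state demo: strong evidence for state 0 except at step 3.
hi, lo = [0.9, 0.2], [0.1, 0.8]
print(viterbi([0.5, 0.5], [[0.7, 0.3], [0.3, 0.7]], [hi, hi, lo, hi, hi]))  # [0, 0, 1, 0, 0]
```

Unlike smoothing, which gives a marginal per time step, Viterbi maximizes over whole sequences, so its answer can differ from taking the most likely state at each step independently.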
Hidden Markov Models
• In a simple Markov model, the observer knows the state
directly; in a hidden Markov model, the state itself is hidden
and only data that depend on it are observed
Hidden Markov Models
[Diagram: hidden states H1, H2, …, Hi, …, HL-1, HL form a Markov chain; each Hi emits an observed data point Xi (X1, X2, …, XL)]
Hidden Markov Models
• Example: a sequence of coin tosses alternating between a fair coin and a loaded coin
– Hidden states: Fair / Loaded; observed data: Head / Tail
– Transition probabilities: stay in the current state with probability 0.9, switch with probability 0.1
– Emission probabilities: the fair coin shows H or T with probability 1/2 each; the loaded coin shows H with probability 3/4 and T with probability 1/4
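Filtering on this coin model shows the belief shifting toward the loaded coin during a run of heads (a minimal sketch; the helper name `filter_coin` is ours, but the probabilities are the ones above):

```python
# Fair/loaded coin HMM: stay with probability 0.9, switch with 0.1;
# fair emits H/T with 1/2 each, loaded emits H with 3/4 and T with 1/4.
T = {"F": {"F": 0.9, "L": 0.1}, "L": {"F": 0.1, "L": 0.9}}
E = {"F": {"H": 0.5, "T": 0.5}, "L": {"H": 0.75, "T": 0.25}}

def filter_coin(obs, prior=(0.5, 0.5)):
    """Forward algorithm over a string of 'H'/'T' observations."""
    f = {"F": prior[0], "L": prior[1]}
    for o in obs:  # one forward step per observed toss
        un = {s: E[s][o] * sum(T[p][s] * f[p] for p in f) for s in f}
        z = sum(un.values())
        f = {s: u / z for s, u in un.items()}
    return f

f = filter_coin("HHHHHH")  # six heads in a row
print(round(f["L"], 3))    # belief that the coin is loaded
```

Each additional head raises the posterior on Loaded, while a run of tails would pull it back toward Fair.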
Hidden Markov Models
• We don't know the location, but we know the output of the sensors
Dynamic Bayesian Network
• A dynamic Bayesian network, or DBN, is a Bayesian network
that represents a temporal probability model
Dynamic Bayesian Network
• To construct a DBN, we must specify three kinds of
information:
– the prior distribution over the state variables, P(X0)
– the transition model, P(Xt+1 | Xt)
– the sensor model, P(Et | Xt)
Dynamic Bayesian Network
• Example
– State:
• Measurement state
Summary
• The changing state of the world is handled by using a set of
random variables to represent the state at each point in time
References
• Stuart Russell, Peter Norvig. 2010. Artificial Intelligence: A
Modern Approach. Pearson Education, New Jersey.
ISBN 9780132071482
• http://aima.cs.berkeley.edu